Feb 23 10:00:56 localhost kernel: Linux version 5.14.0-681.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Wed Feb 11 20:19:22 UTC 2026
Feb 23 10:00:56 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 23 10:00:56 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 23 10:00:56 localhost kernel: BIOS-provided physical RAM map:
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 23 10:00:56 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 23 10:00:56 localhost kernel: NX (Execute Disable) protection: active
Feb 23 10:00:56 localhost kernel: APIC: Static calls initialized
Feb 23 10:00:56 localhost kernel: SMBIOS 2.8 present.
Feb 23 10:00:56 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 23 10:00:56 localhost kernel: Hypervisor detected: KVM
Feb 23 10:00:56 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 23 10:00:56 localhost kernel: kvm-clock: using sched offset of 9021053839 cycles
Feb 23 10:00:56 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 23 10:00:56 localhost kernel: tsc: Detected 2800.000 MHz processor
Feb 23 10:00:56 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 23 10:00:56 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 23 10:00:56 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 23 10:00:56 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 23 10:00:56 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 23 10:00:56 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 23 10:00:56 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 23 10:00:56 localhost kernel: Using GB pages for direct mapping
Feb 23 10:00:56 localhost kernel: RAMDISK: [mem 0x1b6f6000-0x29b72fff]
Feb 23 10:00:56 localhost kernel: ACPI: Early table checksum verification disabled
Feb 23 10:00:56 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 23 10:00:56 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 10:00:56 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 10:00:56 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 10:00:56 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 23 10:00:56 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 10:00:56 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 10:00:56 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 23 10:00:56 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 23 10:00:56 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 23 10:00:56 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 23 10:00:56 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 23 10:00:56 localhost kernel: No NUMA configuration found
Feb 23 10:00:56 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 23 10:00:56 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Feb 23 10:00:56 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 23 10:00:56 localhost kernel: Zone ranges:
Feb 23 10:00:56 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 23 10:00:56 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 23 10:00:56 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 23 10:00:56 localhost kernel:   Device   empty
Feb 23 10:00:56 localhost kernel: Movable zone start for each node
Feb 23 10:00:56 localhost kernel: Early memory node ranges
Feb 23 10:00:56 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 23 10:00:56 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 23 10:00:56 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 23 10:00:56 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 23 10:00:56 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 23 10:00:56 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 23 10:00:56 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 23 10:00:56 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 23 10:00:56 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 23 10:00:56 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 23 10:00:56 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 23 10:00:56 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 23 10:00:56 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 23 10:00:56 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 23 10:00:56 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 23 10:00:56 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 23 10:00:56 localhost kernel: TSC deadline timer available
Feb 23 10:00:56 localhost kernel: CPU topo: Max. logical packages:   8
Feb 23 10:00:56 localhost kernel: CPU topo: Max. logical dies:       8
Feb 23 10:00:56 localhost kernel: CPU topo: Max. dies per package:   1
Feb 23 10:00:56 localhost kernel: CPU topo: Max. threads per core:   1
Feb 23 10:00:56 localhost kernel: CPU topo: Num. cores per package:     1
Feb 23 10:00:56 localhost kernel: CPU topo: Num. threads per package:   1
Feb 23 10:00:56 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 23 10:00:56 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 23 10:00:56 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 23 10:00:56 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 23 10:00:56 localhost kernel: Booting paravirtualized kernel on KVM
Feb 23 10:00:56 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 23 10:00:56 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 23 10:00:56 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 23 10:00:56 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 23 10:00:56 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 23 10:00:56 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 23 10:00:56 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 23 10:00:56 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64", will be passed to user space.
Feb 23 10:00:56 localhost kernel: random: crng init done
Feb 23 10:00:56 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 23 10:00:56 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 23 10:00:56 localhost kernel: Fallback order for Node 0: 0 
Feb 23 10:00:56 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 23 10:00:56 localhost kernel: Policy zone: Normal
Feb 23 10:00:56 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 23 10:00:56 localhost kernel: software IO TLB: area num 8.
Feb 23 10:00:56 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 23 10:00:56 localhost kernel: ftrace: allocating 49565 entries in 194 pages
Feb 23 10:00:56 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 23 10:00:56 localhost kernel: Dynamic Preempt: voluntary
Feb 23 10:00:56 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 23 10:00:56 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 23 10:00:56 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 23 10:00:56 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 23 10:00:56 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 23 10:00:56 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 23 10:00:56 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 23 10:00:56 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 23 10:00:56 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 23 10:00:56 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 23 10:00:56 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 23 10:00:56 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 23 10:00:56 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 23 10:00:56 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 23 10:00:56 localhost kernel: Console: colour VGA+ 80x25
Feb 23 10:00:56 localhost kernel: printk: console [ttyS0] enabled
Feb 23 10:00:56 localhost kernel: ACPI: Core revision 20230331
Feb 23 10:00:56 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 23 10:00:56 localhost kernel: x2apic enabled
Feb 23 10:00:56 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 23 10:00:56 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 23 10:00:56 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb 23 10:00:56 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 23 10:00:56 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 23 10:00:56 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 23 10:00:56 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 23 10:00:56 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 23 10:00:56 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 23 10:00:56 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 23 10:00:56 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 23 10:00:56 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 23 10:00:56 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 23 10:00:56 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 23 10:00:56 localhost kernel: active return thunk: retbleed_return_thunk
Feb 23 10:00:56 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 23 10:00:56 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 23 10:00:56 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 23 10:00:56 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 23 10:00:56 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 23 10:00:56 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 23 10:00:56 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 23 10:00:56 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 23 10:00:56 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 23 10:00:56 localhost kernel: landlock: Up and running.
Feb 23 10:00:56 localhost kernel: Yama: becoming mindful.
Feb 23 10:00:56 localhost kernel: SELinux:  Initializing.
Feb 23 10:00:56 localhost kernel: LSM support for eBPF active
Feb 23 10:00:56 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 23 10:00:56 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 23 10:00:56 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 23 10:00:56 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 23 10:00:56 localhost kernel: ... version:                0
Feb 23 10:00:56 localhost kernel: ... bit width:              48
Feb 23 10:00:56 localhost kernel: ... generic registers:      6
Feb 23 10:00:56 localhost kernel: ... value mask:             0000ffffffffffff
Feb 23 10:00:56 localhost kernel: ... max period:             00007fffffffffff
Feb 23 10:00:56 localhost kernel: ... fixed-purpose events:   0
Feb 23 10:00:56 localhost kernel: ... event mask:             000000000000003f
Feb 23 10:00:56 localhost kernel: signal: max sigframe size: 1776
Feb 23 10:00:56 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 23 10:00:56 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 23 10:00:56 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 23 10:00:56 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 23 10:00:56 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 23 10:00:56 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 23 10:00:56 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb 23 10:00:56 localhost kernel: node 0 deferred pages initialised in 12ms
Feb 23 10:00:56 localhost kernel: Memory: 7617752K/8388068K available (16384K kernel code, 5795K rwdata, 13948K rodata, 4204K init, 7180K bss, 764380K reserved, 0K cma-reserved)
Feb 23 10:00:56 localhost kernel: devtmpfs: initialized
Feb 23 10:00:56 localhost kernel: x86/mm: Memory block size: 128MB
Feb 23 10:00:56 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 23 10:00:56 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 23 10:00:56 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 23 10:00:56 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 23 10:00:56 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 23 10:00:56 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 23 10:00:56 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 23 10:00:56 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 23 10:00:56 localhost kernel: audit: type=2000 audit(1771840855.173:1): state=initialized audit_enabled=0 res=1
Feb 23 10:00:56 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 23 10:00:56 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 23 10:00:56 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 23 10:00:56 localhost kernel: cpuidle: using governor menu
Feb 23 10:00:56 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 23 10:00:56 localhost kernel: PCI: Using configuration type 1 for base access
Feb 23 10:00:56 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 23 10:00:56 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 23 10:00:56 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 23 10:00:56 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 23 10:00:56 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 23 10:00:56 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 23 10:00:56 localhost kernel: Demotion targets for Node 0: null
Feb 23 10:00:56 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 23 10:00:56 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 23 10:00:56 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 23 10:00:56 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 23 10:00:56 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 23 10:00:56 localhost kernel: ACPI: Interpreter enabled
Feb 23 10:00:56 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 23 10:00:56 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 23 10:00:56 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 23 10:00:56 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 23 10:00:56 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 23 10:00:56 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 23 10:00:56 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [3] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [4] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [5] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [6] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [7] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [8] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [9] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [10] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [11] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [12] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [13] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [14] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [15] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [16] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [17] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [18] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [19] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [20] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [21] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [22] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [23] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [24] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [25] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [26] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [27] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [28] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [29] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [30] registered
Feb 23 10:00:56 localhost kernel: acpiphp: Slot [31] registered
Feb 23 10:00:56 localhost kernel: PCI host bridge to bus 0000:00
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 23 10:00:56 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 23 10:00:56 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 23 10:00:56 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 23 10:00:56 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 23 10:00:56 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 23 10:00:56 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 23 10:00:56 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 23 10:00:56 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 23 10:00:56 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 23 10:00:56 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 23 10:00:56 localhost kernel: iommu: Default domain type: Translated
Feb 23 10:00:56 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 23 10:00:56 localhost kernel: SCSI subsystem initialized
Feb 23 10:00:56 localhost kernel: ACPI: bus type USB registered
Feb 23 10:00:56 localhost kernel: usbcore: registered new interface driver usbfs
Feb 23 10:00:56 localhost kernel: usbcore: registered new interface driver hub
Feb 23 10:00:56 localhost kernel: usbcore: registered new device driver usb
Feb 23 10:00:56 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 23 10:00:56 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 23 10:00:56 localhost kernel: PTP clock support registered
Feb 23 10:00:56 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 23 10:00:56 localhost kernel: NetLabel: Initializing
Feb 23 10:00:56 localhost kernel: NetLabel:  domain hash size = 128
Feb 23 10:00:56 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 23 10:00:56 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 23 10:00:56 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 23 10:00:56 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 23 10:00:56 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 23 10:00:56 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 23 10:00:56 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 23 10:00:56 localhost kernel: vgaarb: loaded
Feb 23 10:00:56 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 23 10:00:56 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 23 10:00:56 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 23 10:00:56 localhost kernel: pnp: PnP ACPI init
Feb 23 10:00:56 localhost kernel: pnp 00:03: [dma 2]
Feb 23 10:00:56 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 23 10:00:56 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 23 10:00:56 localhost kernel: NET: Registered PF_INET protocol family
Feb 23 10:00:56 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 23 10:00:56 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 23 10:00:56 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 23 10:00:56 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 23 10:00:56 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 23 10:00:56 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 23 10:00:56 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 23 10:00:56 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 23 10:00:56 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 23 10:00:56 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 23 10:00:56 localhost kernel: NET: Registered PF_XDP protocol family
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 23 10:00:56 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 23 10:00:56 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 23 10:00:56 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 23 10:00:56 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 23357 usecs
Feb 23 10:00:56 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 23 10:00:56 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 23 10:00:56 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 23 10:00:56 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 23 10:00:56 localhost kernel: ACPI: bus type thunderbolt registered
Feb 23 10:00:56 localhost kernel: Initialise system trusted keyrings
Feb 23 10:00:56 localhost kernel: Key type blacklist registered
Feb 23 10:00:56 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 23 10:00:56 localhost kernel: zbud: loaded
Feb 23 10:00:56 localhost kernel: integrity: Platform Keyring initialized
Feb 23 10:00:56 localhost kernel: integrity: Machine keyring initialized
Feb 23 10:00:56 localhost kernel: Freeing initrd memory: 233972K
Feb 23 10:00:56 localhost kernel: NET: Registered PF_ALG protocol family
Feb 23 10:00:56 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 23 10:00:56 localhost kernel: Key type asymmetric registered
Feb 23 10:00:56 localhost kernel: Asymmetric key parser 'x509' registered
Feb 23 10:00:56 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 23 10:00:56 localhost kernel: io scheduler mq-deadline registered
Feb 23 10:00:56 localhost kernel: io scheduler kyber registered
Feb 23 10:00:56 localhost kernel: io scheduler bfq registered
Feb 23 10:00:56 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 23 10:00:56 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 23 10:00:56 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 23 10:00:56 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 23 10:00:56 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 23 10:00:56 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 23 10:00:56 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 23 10:00:56 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 23 10:00:56 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 23 10:00:56 localhost kernel: Non-volatile memory driver v1.3
Feb 23 10:00:56 localhost kernel: rdac: device handler registered
Feb 23 10:00:56 localhost kernel: hp_sw: device handler registered
Feb 23 10:00:56 localhost kernel: emc: device handler registered
Feb 23 10:00:56 localhost kernel: alua: device handler registered
Feb 23 10:00:56 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 23 10:00:56 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 23 10:00:56 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 23 10:00:56 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 23 10:00:56 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 23 10:00:56 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 23 10:00:56 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 23 10:00:56 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-681.el9.x86_64 uhci_hcd
Feb 23 10:00:56 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 23 10:00:56 localhost kernel: hub 1-0:1.0: USB hub found
Feb 23 10:00:56 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 23 10:00:56 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 23 10:00:56 localhost kernel: usbserial: USB Serial support registered for generic
Feb 23 10:00:56 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 23 10:00:56 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 23 10:00:56 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 23 10:00:56 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 23 10:00:56 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 23 10:00:56 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 23 10:00:56 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 23 10:00:56 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-23T10:00:55 UTC (1771840855)
Feb 23 10:00:56 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 23 10:00:56 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 23 10:00:56 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 23 10:00:56 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 23 10:00:56 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 23 10:00:56 localhost kernel: usbcore: registered new interface driver usbhid
Feb 23 10:00:56 localhost kernel: usbhid: USB HID core driver
Feb 23 10:00:56 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 23 10:00:56 localhost kernel: Initializing XFRM netlink socket
Feb 23 10:00:56 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 23 10:00:56 localhost kernel: Segment Routing with IPv6
Feb 23 10:00:56 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 23 10:00:56 localhost kernel: mpls_gso: MPLS GSO support
Feb 23 10:00:56 localhost kernel: IPI shorthand broadcast: enabled
Feb 23 10:00:56 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 23 10:00:56 localhost kernel: AES CTR mode by8 optimization enabled
Feb 23 10:00:56 localhost kernel: sched_clock: Marking stable (1094001810, 147614550)->(1310569180, -68952820)
Feb 23 10:00:56 localhost kernel: registered taskstats version 1
Feb 23 10:00:56 localhost kernel: Loading compiled-in X.509 certificates
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4fde1469d8033882223ec575c1a1c1b88c9a497b'
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 23 10:00:56 localhost kernel: Demotion targets for Node 0: null
Feb 23 10:00:56 localhost kernel: page_owner is disabled
Feb 23 10:00:56 localhost kernel: Key type .fscrypt registered
Feb 23 10:00:56 localhost kernel: Key type fscrypt-provisioning registered
Feb 23 10:00:56 localhost kernel: Key type big_key registered
Feb 23 10:00:56 localhost kernel: Key type encrypted registered
Feb 23 10:00:56 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 23 10:00:56 localhost kernel: Loading compiled-in module X.509 certificates
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4fde1469d8033882223ec575c1a1c1b88c9a497b'
Feb 23 10:00:56 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 23 10:00:56 localhost kernel: ima: No architecture policies found
Feb 23 10:00:56 localhost kernel: evm: Initialising EVM extended attributes:
Feb 23 10:00:56 localhost kernel: evm: security.selinux
Feb 23 10:00:56 localhost kernel: evm: security.SMACK64 (disabled)
Feb 23 10:00:56 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 23 10:00:56 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 23 10:00:56 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 23 10:00:56 localhost kernel: evm: security.apparmor (disabled)
Feb 23 10:00:56 localhost kernel: evm: security.ima
Feb 23 10:00:56 localhost kernel: evm: security.capability
Feb 23 10:00:56 localhost kernel: evm: HMAC attrs: 0x1
Feb 23 10:00:56 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 23 10:00:56 localhost kernel: Running certificate verification RSA selftest
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 23 10:00:56 localhost kernel: Running certificate verification ECDSA selftest
Feb 23 10:00:56 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 23 10:00:56 localhost kernel: clk: Disabling unused clocks
Feb 23 10:00:56 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 23 10:00:56 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 23 10:00:56 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 23 10:00:56 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 388K
Feb 23 10:00:56 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 23 10:00:56 localhost kernel: Run /init as init process
Feb 23 10:00:56 localhost kernel:   with arguments:
Feb 23 10:00:56 localhost kernel:     /init
Feb 23 10:00:56 localhost kernel:   with environment:
Feb 23 10:00:56 localhost kernel:     HOME=/
Feb 23 10:00:56 localhost kernel:     TERM=linux
Feb 23 10:00:56 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64
Feb 23 10:00:56 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 10:00:56 localhost systemd[1]: Detected virtualization kvm.
Feb 23 10:00:56 localhost systemd[1]: Detected architecture x86-64.
Feb 23 10:00:56 localhost systemd[1]: Running in initrd.
Feb 23 10:00:56 localhost systemd[1]: No hostname configured, using default hostname.
Feb 23 10:00:56 localhost systemd[1]: Hostname set to <localhost>.
Feb 23 10:00:56 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 23 10:00:56 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 23 10:00:56 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 23 10:00:56 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 23 10:00:56 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 23 10:00:56 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 23 10:00:56 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 23 10:00:56 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 23 10:00:56 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 23 10:00:56 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 23 10:00:56 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 23 10:00:56 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 23 10:00:56 localhost systemd[1]: Reached target Local File Systems.
Feb 23 10:00:56 localhost systemd[1]: Reached target Path Units.
Feb 23 10:00:56 localhost systemd[1]: Reached target Slice Units.
Feb 23 10:00:56 localhost systemd[1]: Reached target Swaps.
Feb 23 10:00:56 localhost systemd[1]: Reached target Timer Units.
Feb 23 10:00:56 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 23 10:00:56 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 23 10:00:56 localhost systemd[1]: Listening on Journal Socket.
Feb 23 10:00:56 localhost systemd[1]: Listening on udev Control Socket.
Feb 23 10:00:56 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 23 10:00:56 localhost systemd[1]: Reached target Socket Units.
Feb 23 10:00:56 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 23 10:00:56 localhost systemd[1]: Starting Journal Service...
Feb 23 10:00:56 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 23 10:00:56 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 10:00:56 localhost systemd[1]: Starting Create System Users...
Feb 23 10:00:56 localhost systemd[1]: Starting Setup Virtual Console...
Feb 23 10:00:56 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 23 10:00:56 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 23 10:00:56 localhost systemd[1]: Finished Create System Users.
Feb 23 10:00:56 localhost systemd-journald[304]: Journal started
Feb 23 10:00:56 localhost systemd-journald[304]: Runtime Journal (/run/log/journal/07d229308d234ccfb924bb75b3355502) is 8.0M, max 153.6M, 145.6M free.
Feb 23 10:00:56 localhost systemd-sysusers[307]: Creating group 'users' with GID 100.
Feb 23 10:00:56 localhost systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Feb 23 10:00:56 localhost systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 23 10:00:56 localhost systemd[1]: Started Journal Service.
Feb 23 10:00:56 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 23 10:00:56 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 23 10:00:56 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 23 10:00:56 localhost systemd[1]: Finished Setup Virtual Console.
Feb 23 10:00:56 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 23 10:00:56 localhost systemd[1]: Starting dracut cmdline hook...
Feb 23 10:00:56 localhost dracut-cmdline[322]: dracut-9 dracut-057-110.git20260130.el9
Feb 23 10:00:56 localhost dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 23 10:00:56 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 23 10:00:56 localhost systemd[1]: Finished dracut cmdline hook.
Feb 23 10:00:56 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 23 10:00:56 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 23 10:00:56 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 23 10:00:56 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 23 10:00:56 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 23 10:00:56 localhost kernel: RPC: Registered udp transport module.
Feb 23 10:00:56 localhost kernel: RPC: Registered tcp transport module.
Feb 23 10:00:56 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 23 10:00:56 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 23 10:00:56 localhost rpc.statd[440]: Version 2.5.4 starting
Feb 23 10:00:56 localhost rpc.statd[440]: Initializing NSM state
Feb 23 10:00:56 localhost rpc.idmapd[445]: Setting log level to 0
Feb 23 10:00:56 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 23 10:00:56 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 10:00:56 localhost systemd-udevd[458]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 10:00:56 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 10:00:56 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 23 10:00:56 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 23 10:00:56 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 23 10:00:56 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 23 10:00:56 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 10:00:56 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 23 10:00:56 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 23 10:00:56 localhost systemd[1]: Reached target Network.
Feb 23 10:00:56 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 23 10:00:56 localhost systemd[1]: Starting dracut initqueue hook...
Feb 23 10:00:56 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 10:00:56 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 10:00:56 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 23 10:00:56 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 23 10:00:56 localhost kernel:  vda: vda1
Feb 23 10:00:56 localhost kernel: libata version 3.00 loaded.
Feb 23 10:00:56 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 23 10:00:56 localhost kernel: scsi host0: ata_piix
Feb 23 10:00:56 localhost kernel: scsi host1: ata_piix
Feb 23 10:00:56 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 23 10:00:56 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 23 10:00:56 localhost kernel: ACPI: bus type drm_connector registered
Feb 23 10:00:56 localhost systemd[1]: Found device /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c.
Feb 23 10:00:56 localhost systemd[1]: Reached target Initrd Root Device.
Feb 23 10:00:57 localhost kernel: ata1: found unknown device (class 0)
Feb 23 10:00:57 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 23 10:00:57 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 23 10:00:57 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 23 10:00:57 localhost systemd-udevd[474]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:00:57 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 23 10:00:57 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 23 10:00:57 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 23 10:00:57 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 23 10:00:57 localhost systemd[1]: Reached target System Initialization.
Feb 23 10:00:57 localhost systemd[1]: Reached target Basic System.
Feb 23 10:00:57 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 23 10:00:57 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 23 10:00:57 localhost kernel: Console: switching to colour dummy device 80x25
Feb 23 10:00:57 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 23 10:00:57 localhost kernel: [drm] features: -context_init
Feb 23 10:00:57 localhost kernel: [drm] number of scanouts: 1
Feb 23 10:00:57 localhost kernel: [drm] number of cap sets: 0
Feb 23 10:00:57 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 23 10:00:57 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 23 10:00:57 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 23 10:00:57 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 23 10:00:57 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 23 10:00:57 localhost systemd[1]: Finished dracut initqueue hook.
Feb 23 10:00:57 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 23 10:00:57 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 23 10:00:57 localhost systemd[1]: Reached target Remote File Systems.
Feb 23 10:00:57 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 23 10:00:57 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 23 10:00:57 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c...
Feb 23 10:00:57 localhost systemd-fsck[562]: /usr/sbin/fsck.xfs: XFS file system.
Feb 23 10:00:57 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c.
Feb 23 10:00:57 localhost systemd[1]: Mounting /sysroot...
Feb 23 10:00:57 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 23 10:00:57 localhost kernel: XFS (vda1): Mounting V5 Filesystem 9d578f93-c4e9-4172-8459-ef150e54751c
Feb 23 10:00:57 localhost kernel: XFS (vda1): Ending clean mount
Feb 23 10:00:57 localhost systemd[1]: Mounted /sysroot.
Feb 23 10:00:57 localhost systemd[1]: Reached target Initrd Root File System.
Feb 23 10:00:57 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 23 10:00:57 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 23 10:00:57 localhost systemd[1]: Reached target Initrd File Systems.
Feb 23 10:00:57 localhost systemd[1]: Reached target Initrd Default Target.
Feb 23 10:00:57 localhost systemd[1]: Starting dracut mount hook...
Feb 23 10:00:57 localhost systemd[1]: Finished dracut mount hook.
Feb 23 10:00:57 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 23 10:00:57 localhost rpc.idmapd[445]: exiting on signal 15
Feb 23 10:00:57 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 23 10:00:57 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 23 10:00:57 localhost systemd[1]: Stopped target Network.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Timer Units.
Feb 23 10:00:57 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 23 10:00:57 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Basic System.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Path Units.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Remote File Systems.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Slice Units.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Socket Units.
Feb 23 10:00:57 localhost systemd[1]: Stopped target System Initialization.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Local File Systems.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Swaps.
Feb 23 10:00:57 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut mount hook.
Feb 23 10:00:57 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 23 10:00:57 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 23 10:00:57 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 23 10:00:57 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 23 10:00:57 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 23 10:00:57 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 23 10:00:57 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 23 10:00:57 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 23 10:00:57 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 23 10:00:57 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 23 10:00:57 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 23 10:00:57 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 23 10:00:57 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Closed udev Control Socket.
Feb 23 10:00:57 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Closed udev Kernel Socket.
Feb 23 10:00:57 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 23 10:00:57 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 23 10:00:57 localhost systemd[1]: Starting Cleanup udev Database...
Feb 23 10:00:57 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 23 10:00:57 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 23 10:00:57 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Stopped Create System Users.
Feb 23 10:00:57 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 23 10:00:57 localhost systemd[1]: Finished Cleanup udev Database.
Feb 23 10:00:57 localhost systemd[1]: Reached target Switch Root.
Feb 23 10:00:57 localhost systemd[1]: Starting Switch Root...
Feb 23 10:00:58 localhost systemd[1]: Switching root.
Feb 23 10:00:58 localhost systemd-journald[304]: Journal stopped
Feb 23 10:00:58 localhost systemd-journald[304]: Received SIGTERM from PID 1 (systemd).
Feb 23 10:00:58 localhost kernel: audit: type=1404 audit(1771840858.212:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability open_perms=1
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:00:58 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:00:58 localhost kernel: audit: type=1403 audit(1771840858.322:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 23 10:00:58 localhost systemd[1]: Successfully loaded SELinux policy in 112.705ms.
Feb 23 10:00:58 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.333ms.
Feb 23 10:00:58 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 10:00:58 localhost systemd[1]: Detected virtualization kvm.
Feb 23 10:00:58 localhost systemd[1]: Detected architecture x86-64.
Feb 23 10:00:58 localhost systemd-rc-local-generator[647]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:00:58 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Stopped Switch Root.
Feb 23 10:00:58 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 23 10:00:58 localhost systemd[1]: Created slice Slice /system/getty.
Feb 23 10:00:58 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 23 10:00:58 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 23 10:00:58 localhost systemd[1]: Created slice User and Session Slice.
Feb 23 10:00:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 23 10:00:58 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 23 10:00:58 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 23 10:00:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 23 10:00:58 localhost systemd[1]: Stopped target Switch Root.
Feb 23 10:00:58 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 23 10:00:58 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 23 10:00:58 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 23 10:00:58 localhost systemd[1]: Reached target Path Units.
Feb 23 10:00:58 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 23 10:00:58 localhost systemd[1]: Reached target Slice Units.
Feb 23 10:00:58 localhost systemd[1]: Reached target Swaps.
Feb 23 10:00:58 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 23 10:00:58 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 23 10:00:58 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 23 10:00:58 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 23 10:00:58 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 23 10:00:58 localhost systemd[1]: Listening on udev Control Socket.
Feb 23 10:00:58 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 23 10:00:58 localhost systemd[1]: Mounting Huge Pages File System...
Feb 23 10:00:58 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 23 10:00:58 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 23 10:00:58 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 23 10:00:58 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 23 10:00:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 23 10:00:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 10:00:58 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 23 10:00:58 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 23 10:00:58 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 23 10:00:58 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 23 10:00:58 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 23 10:00:58 localhost systemd[1]: Stopped Journal Service.
Feb 23 10:00:58 localhost systemd[1]: Starting Journal Service...
Feb 23 10:00:58 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 23 10:00:58 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 23 10:00:58 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 10:00:58 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 23 10:00:58 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 23 10:00:58 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 10:00:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 23 10:00:58 localhost kernel: fuse: init (API version 7.37)
Feb 23 10:00:58 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 23 10:00:58 localhost systemd[1]: Mounted Huge Pages File System.
Feb 23 10:00:58 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 23 10:00:58 localhost systemd-journald[695]: Journal started
Feb 23 10:00:58 localhost systemd-journald[695]: Runtime Journal (/run/log/journal/621707288f5710e1387f73ed6d90e964) is 8.0M, max 153.6M, 145.6M free.
Feb 23 10:00:58 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 23 10:00:58 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Started Journal Service.
Feb 23 10:00:58 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 23 10:00:58 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 23 10:00:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 23 10:00:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 10:00:58 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 23 10:00:58 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 23 10:00:58 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 23 10:00:58 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 23 10:00:58 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 23 10:00:58 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 23 10:00:58 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 23 10:00:58 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 23 10:00:58 localhost systemd[1]: Mounting FUSE Control File System...
Feb 23 10:00:58 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 23 10:00:58 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 23 10:00:58 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 23 10:00:58 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 23 10:00:58 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 23 10:00:58 localhost systemd[1]: Starting Create System Users...
Feb 23 10:00:58 localhost systemd[1]: Mounted FUSE Control File System.
Feb 23 10:00:58 localhost systemd-journald[695]: Runtime Journal (/run/log/journal/621707288f5710e1387f73ed6d90e964) is 8.0M, max 153.6M, 145.6M free.
Feb 23 10:00:58 localhost systemd-journald[695]: Received client request to flush runtime journal.
Feb 23 10:00:58 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 23 10:00:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 23 10:00:58 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 23 10:00:58 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 23 10:00:58 localhost systemd[1]: Finished Create System Users.
Feb 23 10:00:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 23 10:00:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 23 10:00:58 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 23 10:00:58 localhost systemd[1]: Reached target Local File Systems.
Feb 23 10:00:59 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 23 10:00:59 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 23 10:00:59 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 23 10:00:59 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 23 10:00:59 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 23 10:00:59 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 23 10:00:59 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 23 10:00:59 localhost bootctl[713]: Couldn't find EFI system partition, skipping.
Feb 23 10:00:59 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 23 10:00:59 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 23 10:00:59 localhost systemd[1]: Starting Security Auditing Service...
Feb 23 10:00:59 localhost systemd[1]: Starting RPC Bind...
Feb 23 10:00:59 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 23 10:00:59 localhost auditd[719]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 23 10:00:59 localhost auditd[719]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 23 10:00:59 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 23 10:00:59 localhost systemd[1]: Started RPC Bind.
Feb 23 10:00:59 localhost augenrules[724]: /sbin/augenrules: No change
Feb 23 10:00:59 localhost augenrules[739]: No rules
Feb 23 10:00:59 localhost augenrules[739]: enabled 1
Feb 23 10:00:59 localhost augenrules[739]: failure 1
Feb 23 10:00:59 localhost augenrules[739]: pid 719
Feb 23 10:00:59 localhost augenrules[739]: rate_limit 0
Feb 23 10:00:59 localhost augenrules[739]: backlog_limit 8192
Feb 23 10:00:59 localhost augenrules[739]: lost 0
Feb 23 10:00:59 localhost augenrules[739]: backlog 0
Feb 23 10:00:59 localhost augenrules[739]: backlog_wait_time 60000
Feb 23 10:00:59 localhost augenrules[739]: backlog_wait_time_actual 0
Feb 23 10:00:59 localhost systemd[1]: Started Security Auditing Service.
Feb 23 10:00:59 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 23 10:00:59 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 23 10:00:59 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 23 10:00:59 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 23 10:00:59 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 10:00:59 localhost systemd[1]: Starting Update is Completed...
Feb 23 10:00:59 localhost systemd[1]: Finished Update is Completed.
Feb 23 10:00:59 localhost systemd-udevd[747]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 10:00:59 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 10:00:59 localhost systemd[1]: Reached target System Initialization.
Feb 23 10:00:59 localhost systemd[1]: Started dnf makecache --timer.
Feb 23 10:00:59 localhost systemd[1]: Started Daily rotation of log files.
Feb 23 10:00:59 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 23 10:00:59 localhost systemd[1]: Reached target Timer Units.
Feb 23 10:00:59 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 23 10:00:59 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 23 10:00:59 localhost systemd[1]: Reached target Socket Units.
Feb 23 10:00:59 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 23 10:00:59 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 10:00:59 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 23 10:00:59 localhost systemd-udevd[763]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:00:59 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 10:00:59 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 10:00:59 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 10:00:59 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 23 10:00:59 localhost systemd[1]: Reached target Basic System.
Feb 23 10:00:59 localhost dbus-broker-lau[781]: Ready
Feb 23 10:00:59 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 23 10:00:59 localhost systemd[1]: Starting NTP client/server...
Feb 23 10:00:59 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 23 10:00:59 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 23 10:00:59 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 23 10:00:59 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 23 10:00:59 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 23 10:00:59 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 23 10:00:59 localhost systemd[1]: Started irqbalance daemon.
Feb 23 10:00:59 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 23 10:00:59 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 10:00:59 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 10:00:59 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 10:00:59 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 23 10:00:59 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 23 10:00:59 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 23 10:00:59 localhost chronyd[807]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 10:00:59 localhost systemd[1]: Starting User Login Management...
Feb 23 10:00:59 localhost chronyd[807]: Loaded 0 symmetric keys
Feb 23 10:00:59 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 23 10:00:59 localhost chronyd[807]: Using right/UTC timezone to obtain leap second data
Feb 23 10:00:59 localhost chronyd[807]: Loaded seccomp filter (level 2)
Feb 23 10:00:59 localhost systemd[1]: Started NTP client/server.
Feb 23 10:00:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 23 10:00:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 23 10:00:59 localhost systemd-logind[808]: New seat seat0.
Feb 23 10:00:59 localhost systemd-logind[808]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 23 10:00:59 localhost systemd-logind[808]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 23 10:00:59 localhost systemd[1]: Started User Login Management.
Feb 23 10:00:59 localhost kernel: kvm_amd: TSC scaling supported
Feb 23 10:00:59 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 23 10:00:59 localhost kernel: kvm_amd: Nested Paging enabled
Feb 23 10:00:59 localhost kernel: kvm_amd: LBR virtualization supported
Feb 23 10:00:59 localhost iptables.init[801]: iptables: Applying firewall rules: [  OK  ]
Feb 23 10:00:59 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 23 10:01:00 localhost cloud-init[851]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 23 Feb 2026 10:01:00 +0000. Up 5.92 seconds.
Feb 23 10:01:00 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 23 10:01:00 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 23 10:01:00 localhost systemd[1]: run-cloud\x2dinit-tmp-tmptv1wgtt_.mount: Deactivated successfully.
Feb 23 10:01:00 localhost systemd[1]: Starting Hostname Service...
Feb 23 10:01:00 localhost systemd[1]: Started Hostname Service.
Feb 23 10:01:00 np0005626601.novalocal systemd-hostnamed[865]: Hostname set to <np0005626601.novalocal> (static)
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Reached target Preparation for Network.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Starting Network Manager...
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8447] NetworkManager (version 1.54.3-2.el9) is starting... (boot:9e288ccf-ad11-4627-abc5-9df48b7c9713)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8453] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8593] manager[0x556622fc5000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8636] hostname: hostname: using hostnamed
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8636] hostname: static hostname changed from (none) to "np0005626601.novalocal"
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8641] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8739] manager[0x556622fc5000]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8740] manager[0x556622fc5000]: rfkill: WWAN hardware radio set enabled
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8820] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8821] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8822] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8823] manager: Networking is enabled by state file
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8825] settings: Loaded settings plugin: keyfile (internal)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8858] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8881] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8893] dhcp: init: Using DHCP client 'internal'
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8896] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8910] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8922] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8934] device (lo): Activation: starting connection 'lo' (c3e17fe3-3502-4aa2-b43f-fdb973524017)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8942] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8945] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8970] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8975] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8978] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8980] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8982] device (eth0): carrier: link connected
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8986] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8992] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.8997] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9002] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9003] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9006] manager: NetworkManager state is now CONNECTING
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9008] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9015] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9019] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Started Network Manager.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Reached target Network.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9233] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9242] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 23 10:01:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840860.9248] device (lo): Activation: successful, device activated.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Reached target NFS client services.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: Reached target Remote File Systems.
Feb 23 10:01:00 np0005626601.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4282] dhcp4 (eth0): state changed new lease, address=38.102.83.199
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4300] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4343] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4374] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4377] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4382] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4387] device (eth0): Activation: successful, device activated.
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4402] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 10:01:01 np0005626601.novalocal NetworkManager[869]: <info>  [1771840861.4407] manager: startup complete
Feb 23 10:01:01 np0005626601.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 23 10:01:01 np0005626601.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 23 Feb 2026 10:01:01 +0000. Up 7.28 seconds.
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |  eth0  | True |        38.102.83.199         | 255.255.255.0 | global | fa:16:3e:04:b2:2d |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |  eth0  | True | fe80::f816:3eff:fe04:b22d/64 |       .       |  link  | fa:16:3e:04:b2:2d |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 23 10:01:01 np0005626601.novalocal cloud-init[933]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 10:01:02 np0005626601.novalocal useradd[1000]: new group: name=cloud-user, GID=1001
Feb 23 10:01:02 np0005626601.novalocal useradd[1000]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 23 10:01:02 np0005626601.novalocal useradd[1000]: add 'cloud-user' to group 'adm'
Feb 23 10:01:02 np0005626601.novalocal useradd[1000]: add 'cloud-user' to group 'systemd-journal'
Feb 23 10:01:02 np0005626601.novalocal useradd[1000]: add 'cloud-user' to shadow group 'adm'
Feb 23 10:01:02 np0005626601.novalocal useradd[1000]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Generating public/private rsa key pair.
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: The key fingerprint is:
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: SHA256:lh1hHUD6cshQILkvl1ey6v5Vs92SHlVF7AqkC3uUmmI root@np0005626601.novalocal
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: The key's randomart image is:
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: +---[RSA 3072]----+
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |    ......=o.. oo|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |    .. . o .o   o|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |     .. . .+   ..|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |    .  oo==..   o|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |     . .SX+.o. ..|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |    . E.*oo. +.+ |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |     + + .. . = .|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |      .  .   . o |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |     oo..     .  |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: +----[SHA256]-----+
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Generating public/private ecdsa key pair.
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: The key fingerprint is:
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: SHA256:GA4K10Pl2pi5477TM/tHwUCShqDiOl0+5QqL0CSD5nE root@np0005626601.novalocal
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: The key's randomart image is:
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: +---[ECDSA 256]---+
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |  ...ooo.        |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: | . o..o..        |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |+ . +.o  o       |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |+o . X o  o      |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |++oE* = S  .     |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |+*oo +    .      |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |+.+ =..  .       |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |.o +.++   .      |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |. ..=o.=..       |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: +----[SHA256]-----+
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Generating public/private ed25519 key pair.
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: The key fingerprint is:
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: SHA256:gWvc1Rldr24S70KGdnO5EmS/W9eLfChwzOGfsfSICz4 root@np0005626601.novalocal
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: The key's randomart image is:
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: +--[ED25519 256]--+
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |            .. ..|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |       .   . o. .|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |      . . . o   .|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |     . o o .o  . |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |      + S ++o... |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |     .   .o=**+ .|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |         oo+=+@o+|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |        .E.o+X++o|
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: |         ...o=+o |
Feb 23 10:01:02 np0005626601.novalocal cloud-init[933]: +----[SHA256]-----+
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Reached target Network is Online.
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Starting System Logging Service...
Feb 23 10:01:02 np0005626601.novalocal sm-notify[1016]: Version 2.5.4 starting
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Starting Permit User Sessions...
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 23 10:01:02 np0005626601.novalocal sshd[1018]: Server listening on 0.0.0.0 port 22.
Feb 23 10:01:02 np0005626601.novalocal sshd[1018]: Server listening on :: port 22.
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Finished Permit User Sessions.
Feb 23 10:01:02 np0005626601.novalocal systemd[1]: Started Command Scheduler.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Started Getty on tty1.
Feb 23 10:01:03 np0005626601.novalocal crond[1021]: (CRON) STARTUP (1.5.7)
Feb 23 10:01:03 np0005626601.novalocal crond[1021]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 23 10:01:03 np0005626601.novalocal crond[1021]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 10% if used.)
Feb 23 10:01:03 np0005626601.novalocal crond[1021]: (CRON) INFO (running with inotify support)
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Reached target Login Prompts.
Feb 23 10:01:03 np0005626601.novalocal rsyslogd[1017]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1017" x-info="https://www.rsyslog.com"] start
Feb 23 10:01:03 np0005626601.novalocal rsyslogd[1017]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Started System Logging Service.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Reached target Multi-User System.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 23 10:01:03 np0005626601.novalocal rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:01:03 np0005626601.novalocal kdumpctl[1030]: kdump: No kdump initial ramdisk found.
Feb 23 10:01:03 np0005626601.novalocal kdumpctl[1030]: kdump: Rebuilding /boot/initramfs-5.14.0-681.el9.x86_64kdump.img
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1190]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 23 Feb 2026 10:01:03 +0000. Up 8.77 seconds.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1296]: Unable to negotiate with 38.102.83.114 port 36702: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1312]: Connection closed by 38.102.83.114 port 36712 [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1324]: Unable to negotiate with 38.102.83.114 port 36716: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1331]: Unable to negotiate with 38.102.83.114 port 36728: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1340]: Connection reset by 38.102.83.114 port 36734 [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1360]: Connection closed by 38.102.83.114 port 36744 [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1369]: Unable to negotiate with 38.102.83.114 port 36758: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1262]: Connection closed by 38.102.83.114 port 60538 [preauth]
Feb 23 10:01:03 np0005626601.novalocal sshd-session[1385]: Unable to negotiate with 38.102.83.114 port 36764: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 23 10:01:03 np0005626601.novalocal dracut[1527]: dracut-057-110.git20260130.el9
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1525]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 23 Feb 2026 10:01:03 +0000. Up 9.15 seconds.
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1544]: #############################################################
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1545]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1547]: 256 SHA256:GA4K10Pl2pi5477TM/tHwUCShqDiOl0+5QqL0CSD5nE root@np0005626601.novalocal (ECDSA)
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1550]: 256 SHA256:gWvc1Rldr24S70KGdnO5EmS/W9eLfChwzOGfsfSICz4 root@np0005626601.novalocal (ED25519)
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1554]: 3072 SHA256:lh1hHUD6cshQILkvl1ey6v5Vs92SHlVF7AqkC3uUmmI root@np0005626601.novalocal (RSA)
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1557]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1559]: #############################################################
Feb 23 10:01:03 np0005626601.novalocal dracut[1529]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-681.el9.x86_64kdump.img 5.14.0-681.el9.x86_64
Feb 23 10:01:03 np0005626601.novalocal cloud-init[1525]: Cloud-init v. 24.4-8.el9 finished at Mon, 23 Feb 2026 10:01:03 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.31 seconds
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 23 10:01:03 np0005626601.novalocal systemd[1]: Reached target Cloud-init target.
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: memstrack is not available
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: memstrack is not available
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: *** Including module: systemd ***
Feb 23 10:01:04 np0005626601.novalocal dracut[1529]: *** Including module: fips ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: systemd-initrd ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: i18n ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: drm ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: prefixdevname ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: kernel-modules ***
Feb 23 10:01:05 np0005626601.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: kernel-modules-extra ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: qemu ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: fstab-sys ***
Feb 23 10:01:05 np0005626601.novalocal dracut[1529]: *** Including module: rootfs-block ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: terminfo ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: udev-rules ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: Skipping udev rule: 91-permissions.rules
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: virtiofs ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: dracut-systemd ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: usrmount ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: base ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: fs-lib ***
Feb 23 10:01:06 np0005626601.novalocal dracut[1529]: *** Including module: kdumpbase ***
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:   microcode_ctl module: mangling fw_dir
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]: *** Including module: openssl ***
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]: *** Including module: shutdown ***
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]: *** Including module: squash ***
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]: *** Including modules done ***
Feb 23 10:01:07 np0005626601.novalocal dracut[1529]: *** Installing kernel module dependencies ***
Feb 23 10:01:07 np0005626601.novalocal chronyd[807]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Feb 23 10:01:07 np0005626601.novalocal chronyd[807]: System clock wrong by 1.343873 seconds
Feb 23 10:01:09 np0005626601.novalocal chronyd[807]: System clock was stepped by 1.343873 seconds
Feb 23 10:01:09 np0005626601.novalocal chronyd[807]: System clock TAI offset set to 37 seconds
Feb 23 10:01:09 np0005626601.novalocal dracut[1529]: *** Installing kernel module dependencies done ***
Feb 23 10:01:09 np0005626601.novalocal dracut[1529]: *** Resolving executable dependencies ***
Feb 23 10:01:10 np0005626601.novalocal dracut[1529]: *** Resolving executable dependencies done ***
Feb 23 10:01:10 np0005626601.novalocal dracut[1529]: *** Generating early-microcode cpio image ***
Feb 23 10:01:10 np0005626601.novalocal dracut[1529]: *** Store current command line parameters ***
Feb 23 10:01:10 np0005626601.novalocal dracut[1529]: Stored kernel commandline:
Feb 23 10:01:10 np0005626601.novalocal dracut[1529]: No dracut internal kernel commandline stored in the initramfs
Feb 23 10:01:10 np0005626601.novalocal dracut[1529]: *** Install squash loader ***
Feb 23 10:01:11 np0005626601.novalocal dracut[1529]: *** Squashing the files inside the initramfs ***
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: IRQ 25 affinity is now unmanaged
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: IRQ 31 affinity is now unmanaged
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: IRQ 28 affinity is now unmanaged
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: IRQ 32 affinity is now unmanaged
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: IRQ 30 affinity is now unmanaged
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 23 10:01:11 np0005626601.novalocal irqbalance[802]: IRQ 29 affinity is now unmanaged
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: *** Squashing the files inside the initramfs done ***
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: *** Creating image file '/boot/initramfs-5.14.0-681.el9.x86_64kdump.img' ***
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: *** Hardlinking files ***
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Mode:           real
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Files:          50
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Linked:         0 files
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Compared:       0 xattrs
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Compared:       0 files
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Saved:          0 B
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: Duration:       0.000540 seconds
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: *** Hardlinking files done ***
Feb 23 10:01:12 np0005626601.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 10:01:12 np0005626601.novalocal dracut[1529]: *** Creating initramfs image file '/boot/initramfs-5.14.0-681.el9.x86_64kdump.img' done ***
Feb 23 10:01:13 np0005626601.novalocal kdumpctl[1030]: kdump: kexec: loaded kdump kernel
Feb 23 10:01:13 np0005626601.novalocal kdumpctl[1030]: kdump: Starting kdump: [OK]
Feb 23 10:01:13 np0005626601.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 23 10:01:13 np0005626601.novalocal systemd[1]: Startup finished in 1.405s (kernel) + 2.362s (initrd) + 13.791s (userspace) = 17.559s.
Feb 23 10:01:21 np0005626601.novalocal sshd-session[4799]: Accepted publickey for zuul from 38.102.83.114 port 52242 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 23 10:01:21 np0005626601.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 23 10:01:21 np0005626601.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 23 10:01:21 np0005626601.novalocal systemd-logind[808]: New session 1 of user zuul.
Feb 23 10:01:21 np0005626601.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 23 10:01:21 np0005626601.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Queued start job for default target Main User Target.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Created slice User Application Slice.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Reached target Paths.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Reached target Timers.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Starting D-Bus User Message Bus Socket...
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Starting Create User's Volatile Files and Directories...
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Finished Create User's Volatile Files and Directories.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Listening on D-Bus User Message Bus Socket.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Reached target Sockets.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Reached target Basic System.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Reached target Main User Target.
Feb 23 10:01:21 np0005626601.novalocal systemd[4803]: Startup finished in 114ms.
Feb 23 10:01:21 np0005626601.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 23 10:01:21 np0005626601.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 23 10:01:21 np0005626601.novalocal sshd-session[4799]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:01:21 np0005626601.novalocal python3[4885]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:01:21 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 23 10:01:21 np0005626601.novalocal irqbalance[802]: IRQ 26 affinity is now unmanaged
Feb 23 10:01:26 np0005626601.novalocal python3[4913]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:01:32 np0005626601.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 10:01:33 np0005626601.novalocal python3[4973]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:01:33 np0005626601.novalocal python3[5013]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 23 10:01:35 np0005626601.novalocal python3[5039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzvManFcsGtormMFrDIX6jHGMdR0S8k12I53mt2OyPQjf4ZFjt+LMfcoVHjh8BY+7LKh0GRUVaq1maRAR+w8LeUFwF150ccb5fqOIFgDZivw6bZenboP1JOUt86yo9ajaxnXOTZ+tjbPERm2/48gmMtTZjvAEg3Ma0rBEms/laajZUbJ0sok44pz7BY9D+nfU7nTehCpC1FzigkNBNAaoAsY25wKKpfzfcvVBczF/fwItupdZa9PhRhsEKhOmYFvbghREe9TDIvztszERYOOf4Tfx1EXsuE0mDBzoAoJ4lX+CDiu8Ix89AQfTxquYWqNpVwZrjY2GKUqOOqWgqhnSi9+ZIq9vKa35MYiQxOy3JEfk74qljUjHB1x9FYgy7dfFeTlbipvPkLCIklBBdfWyJ/Klv1xA6OIQUIgKYE/nbs6jkDOU4nm8EmWPhOURPw1BwStfCtTmVmQMtWUUnKLlL2C2vEkbYssAy0R/2hUlCp4TVFkwovs7tbw6pzmfCnf0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:36 np0005626601.novalocal python3[5063]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:36 np0005626601.novalocal python3[5162]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:01:37 np0005626601.novalocal python3[5233]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771840896.4549167-229-26825720871683/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=422fc1f90ede48598e9cf9286e82ffb9_id_rsa follow=False checksum=5420388ef24ecbef5cf986ada1be6af22bb927f5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:37 np0005626601.novalocal python3[5356]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:01:37 np0005626601.novalocal python3[5427]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771840897.3476744-273-257334320049059/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=422fc1f90ede48598e9cf9286e82ffb9_id_rsa.pub follow=False checksum=7a1c4fd139ddea3be90854a4d4d96b05518fc33e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:39 np0005626601.novalocal python3[5475]: ansible-ping Invoked with data=pong
Feb 23 10:01:40 np0005626601.novalocal python3[5499]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:01:42 np0005626601.novalocal python3[5557]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 23 10:01:43 np0005626601.novalocal python3[5589]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:43 np0005626601.novalocal python3[5613]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:44 np0005626601.novalocal python3[5637]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:44 np0005626601.novalocal python3[5661]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:44 np0005626601.novalocal python3[5685]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:45 np0005626601.novalocal python3[5709]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:47 np0005626601.novalocal sudo[5733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frgrlzbsigtiphtbhdkixqaiizxuilkm ; /usr/bin/python3'
Feb 23 10:01:47 np0005626601.novalocal sudo[5733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:01:47 np0005626601.novalocal python3[5735]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:47 np0005626601.novalocal sudo[5733]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:48 np0005626601.novalocal sudo[5811]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjrkdbhardmhuplbczrggozeumnqugwi ; /usr/bin/python3'
Feb 23 10:01:48 np0005626601.novalocal sudo[5811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:01:48 np0005626601.novalocal python3[5813]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:01:48 np0005626601.novalocal sudo[5811]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:48 np0005626601.novalocal sudo[5884]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnnalojxqalgtdqspgzdfldsfiieepte ; /usr/bin/python3'
Feb 23 10:01:48 np0005626601.novalocal sudo[5884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:01:48 np0005626601.novalocal python3[5886]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771840907.82297-26-90864777834238/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:48 np0005626601.novalocal sudo[5884]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:49 np0005626601.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:49 np0005626601.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:49 np0005626601.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:49 np0005626601.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:50 np0005626601.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:50 np0005626601.novalocal python3[6054]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:50 np0005626601.novalocal python3[6078]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:51 np0005626601.novalocal python3[6102]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:51 np0005626601.novalocal python3[6126]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:51 np0005626601.novalocal python3[6150]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:51 np0005626601.novalocal python3[6174]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:52 np0005626601.novalocal python3[6198]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:52 np0005626601.novalocal python3[6222]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:52 np0005626601.novalocal python3[6246]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:53 np0005626601.novalocal python3[6270]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:53 np0005626601.novalocal python3[6294]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:53 np0005626601.novalocal python3[6318]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:53 np0005626601.novalocal python3[6342]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:54 np0005626601.novalocal python3[6366]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:54 np0005626601.novalocal python3[6390]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:54 np0005626601.novalocal python3[6414]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:54 np0005626601.novalocal python3[6438]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:54 np0005626601.novalocal python3[6462]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:55 np0005626601.novalocal python3[6486]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:55 np0005626601.novalocal python3[6510]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:55 np0005626601.novalocal python3[6534]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:01:58 np0005626601.novalocal sudo[6558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrquzklbspxeyydhriypuufvcwlpevfm ; /usr/bin/python3'
Feb 23 10:01:58 np0005626601.novalocal sudo[6558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:01:58 np0005626601.novalocal python3[6560]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 23 10:01:58 np0005626601.novalocal systemd[1]: Starting Time & Date Service...
Feb 23 10:01:58 np0005626601.novalocal systemd[1]: Started Time & Date Service.
Feb 23 10:01:58 np0005626601.novalocal systemd-timedated[6562]: Changed time zone to 'UTC' (UTC).
Feb 23 10:01:58 np0005626601.novalocal sudo[6558]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:59 np0005626601.novalocal sudo[6589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izidbksneqxqminjahdlqrrymjsvhsul ; /usr/bin/python3'
Feb 23 10:01:59 np0005626601.novalocal sudo[6589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:01:59 np0005626601.novalocal python3[6591]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:01:59 np0005626601.novalocal sudo[6589]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:59 np0005626601.novalocal python3[6667]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:01:59 np0005626601.novalocal python3[6738]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771840919.4384212-202-19527781716701/source _original_basename=tmp_zsvpxd9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:02:00 np0005626601.novalocal python3[6838]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:02:00 np0005626601.novalocal python3[6909]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771840920.266544-242-91042667285424/source _original_basename=tmppl3nef4k follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:02:01 np0005626601.novalocal sudo[7009]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mncikplulvixsogawykayogxjqakaguo ; /usr/bin/python3'
Feb 23 10:02:01 np0005626601.novalocal sudo[7009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:02:01 np0005626601.novalocal python3[7011]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:02:01 np0005626601.novalocal sudo[7009]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:01 np0005626601.novalocal sudo[7082]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuqncowyefkilqmjspbdsnzahehkpwob ; /usr/bin/python3'
Feb 23 10:02:01 np0005626601.novalocal sudo[7082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:02:01 np0005626601.novalocal python3[7084]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771840921.3704321-306-257140249802607/source _original_basename=tmpev7wlmu0 follow=False checksum=39426ec7837fecf5ea9c3f450d9216c750a13535 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:02:01 np0005626601.novalocal sudo[7082]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:02 np0005626601.novalocal python3[7132]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:02:02 np0005626601.novalocal python3[7158]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:02:03 np0005626601.novalocal sudo[7236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxpezcyufyaxlhzfblpgfvtwfhgmdwr ; /usr/bin/python3'
Feb 23 10:02:03 np0005626601.novalocal sudo[7236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:02:03 np0005626601.novalocal python3[7238]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:02:03 np0005626601.novalocal sudo[7236]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:03 np0005626601.novalocal sudo[7309]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwvqmdhimofdwqrggfoinpfgefigarom ; /usr/bin/python3'
Feb 23 10:02:03 np0005626601.novalocal sudo[7309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:02:03 np0005626601.novalocal python3[7311]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771840922.889141-362-73368467030134/source _original_basename=tmp002s_pbw follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:02:03 np0005626601.novalocal sudo[7309]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:03 np0005626601.novalocal sudo[7360]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krggvookuwniggqoxcawgjztggmogbyk ; /usr/bin/python3'
Feb 23 10:02:03 np0005626601.novalocal sudo[7360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:02:04 np0005626601.novalocal python3[7362]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-f315-7186-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:02:04 np0005626601.novalocal sudo[7360]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:04 np0005626601.novalocal python3[7390]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-f315-7186-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 23 10:02:06 np0005626601.novalocal python3[7418]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:02:25 np0005626601.novalocal sudo[7442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orvnpesevuiipkigkptndqlqspmayqjx ; /usr/bin/python3'
Feb 23 10:02:25 np0005626601.novalocal sudo[7442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:02:25 np0005626601.novalocal python3[7444]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:02:25 np0005626601.novalocal sudo[7442]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:28 np0005626601.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 23 10:03:00 np0005626601.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 23 10:03:00 np0005626601.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4713] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 10:03:00 np0005626601.novalocal systemd-udevd[7448]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4868] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4889] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4891] device (eth1): carrier: link connected
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4892] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4896] policy: auto-activating connection 'Wired connection 1' (a3dcef92-ee93-3e7f-aa26-1790b6720768)
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4899] device (eth1): Activation: starting connection 'Wired connection 1' (a3dcef92-ee93-3e7f-aa26-1790b6720768)
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4900] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4901] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4904] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:03:00 np0005626601.novalocal NetworkManager[869]: <info>  [1771840980.4907] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:03:01 np0005626601.novalocal python3[7474]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-b57d-8f17-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:03:07 np0005626601.novalocal sudo[7552]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpmjqjaxldurlpjkboceugtnfkbkhzhw ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 23 10:03:07 np0005626601.novalocal sudo[7552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:03:08 np0005626601.novalocal python3[7554]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:03:08 np0005626601.novalocal sudo[7552]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:08 np0005626601.novalocal sudo[7625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdccrizmgnrpffbpczeghiogqbuibxui ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 23 10:03:08 np0005626601.novalocal sudo[7625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:03:08 np0005626601.novalocal python3[7627]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771840987.858472-103-110696586804438/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=5f2468b1d9c4a7a9b13ac610f7ebe699fb0964f2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:03:08 np0005626601.novalocal sudo[7625]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:08 np0005626601.novalocal sudo[7675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juxrhbxrzopmcjjzbwcvhxsmbdkffyns ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 23 10:03:08 np0005626601.novalocal sudo[7675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:03:09 np0005626601.novalocal python3[7677]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2377] caught SIGTERM, shutting down normally.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Stopping Network Manager...
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2390] dhcp4 (eth0): canceled DHCP transaction
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2391] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2392] dhcp4 (eth0): state changed no lease
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2396] manager: NetworkManager state is now CONNECTING
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2577] dhcp4 (eth1): canceled DHCP transaction
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2578] dhcp4 (eth1): state changed no lease
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[869]: <info>  [1771840989.2635] exiting (success)
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Stopped Network Manager.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Starting Network Manager...
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3172] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:9e288ccf-ad11-4627-abc5-9df48b7c9713)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3175] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3222] manager[0x55ad954a1000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Starting Hostname Service...
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Started Hostname Service.
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3934] hostname: hostname: using hostnamed
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3937] hostname: static hostname changed from (none) to "np0005626601.novalocal"
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3941] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3946] manager[0x55ad954a1000]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3947] manager[0x55ad954a1000]: rfkill: WWAN hardware radio set enabled
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3973] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3974] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3974] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3975] manager: Networking is enabled by state file
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3977] settings: Loaded settings plugin: keyfile (internal)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.3980] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4004] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4013] dhcp: init: Using DHCP client 'internal'
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4016] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4021] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4026] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4033] device (lo): Activation: starting connection 'lo' (c3e17fe3-3502-4aa2-b43f-fdb973524017)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4039] device (eth0): carrier: link connected
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4042] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4047] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4048] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4054] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4060] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4065] device (eth1): carrier: link connected
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4068] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4073] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (a3dcef92-ee93-3e7f-aa26-1790b6720768) (indicated)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4073] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4079] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4085] device (eth1): Activation: starting connection 'Wired connection 1' (a3dcef92-ee93-3e7f-aa26-1790b6720768)
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Started Network Manager.
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4091] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4095] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4098] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4100] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4102] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4106] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4109] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4111] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4116] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4122] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4124] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4131] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4133] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4150] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4152] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4157] device (lo): Activation: successful, device activated.
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4163] dhcp4 (eth0): state changed new lease, address=38.102.83.199
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4171] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4242] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4277] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4279] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4282] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4285] device (eth0): Activation: successful, device activated.
Feb 23 10:03:09 np0005626601.novalocal NetworkManager[7689]: <info>  [1771840989.4289] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 10:03:09 np0005626601.novalocal sudo[7675]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:09 np0005626601.novalocal python3[7761]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-b57d-8f17-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:03:19 np0005626601.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 10:03:27 np0005626601.novalocal systemd[4803]: Starting Mark boot as successful...
Feb 23 10:03:27 np0005626601.novalocal systemd[4803]: Finished Mark boot as successful.
Feb 23 10:03:39 np0005626601.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.7795] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 23 10:03:54 np0005626601.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 10:03:54 np0005626601.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8044] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8046] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8051] device (eth1): Activation: successful, device activated.
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8057] manager: startup complete
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8058] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <warn>  [1771841034.8061] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8065] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8131] dhcp4 (eth1): canceled DHCP transaction
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8131] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8131] dhcp4 (eth1): state changed no lease
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8153] policy: auto-activating connection 'ci-private-network' (2157fe3c-5bc1-52fd-87e4-427c4a0eb10c)
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8162] device (eth1): Activation: starting connection 'ci-private-network' (2157fe3c-5bc1-52fd-87e4-427c4a0eb10c)
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8163] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8168] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8179] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8190] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8235] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8238] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:03:54 np0005626601.novalocal NetworkManager[7689]: <info>  [1771841034.8245] device (eth1): Activation: successful, device activated.
Feb 23 10:04:04 np0005626601.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 10:04:09 np0005626601.novalocal sshd-session[4812]: Received disconnect from 38.102.83.114 port 52242:11: disconnected by user
Feb 23 10:04:09 np0005626601.novalocal sshd-session[4812]: Disconnected from user zuul 38.102.83.114 port 52242
Feb 23 10:04:09 np0005626601.novalocal sshd-session[4799]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:04:09 np0005626601.novalocal systemd-logind[808]: Session 1 logged out. Waiting for processes to exit.
Feb 23 10:04:34 np0005626601.novalocal sshd-session[7790]: Accepted publickey for zuul from 38.102.83.114 port 57350 ssh2: RSA SHA256:BiJLa2SOE3eAuOQ4B+aHpprkSCedZrO5BRA4A2P+Trc
Feb 23 10:04:34 np0005626601.novalocal systemd-logind[808]: New session 3 of user zuul.
Feb 23 10:04:34 np0005626601.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 23 10:04:34 np0005626601.novalocal sshd-session[7790]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:04:35 np0005626601.novalocal sudo[7869]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwfxdriwonmppezcgxwkftytiofwiel ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 23 10:04:35 np0005626601.novalocal sudo[7869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:04:35 np0005626601.novalocal python3[7871]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:04:35 np0005626601.novalocal sudo[7869]: pam_unix(sudo:session): session closed for user root
Feb 23 10:04:35 np0005626601.novalocal sudo[7942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hygucmacyxbqmntuccqjzzwivtszirbr ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 23 10:04:35 np0005626601.novalocal sudo[7942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:04:35 np0005626601.novalocal python3[7944]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771841074.875828-312-104205162799069/source _original_basename=tmpket7tsns follow=False checksum=41c0c49b3d8563d29bc0e6daceb6efdf63da9451 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:04:35 np0005626601.novalocal sudo[7942]: pam_unix(sudo:session): session closed for user root
Feb 23 10:04:38 np0005626601.novalocal sshd-session[7793]: Connection closed by 38.102.83.114 port 57350
Feb 23 10:04:38 np0005626601.novalocal sshd-session[7790]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:04:38 np0005626601.novalocal systemd-logind[808]: Session 3 logged out. Waiting for processes to exit.
Feb 23 10:04:38 np0005626601.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 23 10:04:38 np0005626601.novalocal systemd-logind[808]: Removed session 3.
Feb 23 10:06:27 np0005626601.novalocal systemd[4803]: Created slice User Background Tasks Slice.
Feb 23 10:06:27 np0005626601.novalocal systemd[4803]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 10:06:27 np0005626601.novalocal systemd[4803]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 10:07:08 np0005626601.novalocal sshd-session[7973]: Invalid user admin from 80.94.95.116 port 32540
Feb 23 10:07:08 np0005626601.novalocal sshd-session[7973]: Connection closed by invalid user admin 80.94.95.116 port 32540 [preauth]
Feb 23 10:08:12 np0005626601.novalocal sshd-session[7975]: Connection closed by authenticating user root 157.20.215.3 port 36722 [preauth]
Feb 23 10:08:14 np0005626601.novalocal sshd-session[7977]: Connection closed by authenticating user root 157.20.215.3 port 36736 [preauth]
Feb 23 10:08:16 np0005626601.novalocal sshd-session[7979]: Connection closed by authenticating user root 157.20.215.3 port 36748 [preauth]
Feb 23 10:08:19 np0005626601.novalocal sshd-session[7981]: Connection closed by authenticating user root 157.20.215.3 port 51584 [preauth]
Feb 23 10:08:21 np0005626601.novalocal sshd-session[7983]: Connection closed by authenticating user root 157.20.215.3 port 51600 [preauth]
Feb 23 10:08:24 np0005626601.novalocal sshd-session[7985]: Connection closed by authenticating user root 157.20.215.3 port 51610 [preauth]
Feb 23 10:08:26 np0005626601.novalocal sshd-session[7987]: Connection closed by authenticating user root 157.20.215.3 port 51626 [preauth]
Feb 23 10:08:28 np0005626601.novalocal sshd-session[7989]: Connection closed by authenticating user root 157.20.215.3 port 43398 [preauth]
Feb 23 10:08:30 np0005626601.novalocal sshd-session[7991]: Connection closed by authenticating user root 157.20.215.3 port 43406 [preauth]
Feb 23 10:08:32 np0005626601.novalocal sshd-session[7993]: Connection closed by authenticating user root 157.20.215.3 port 43416 [preauth]
Feb 23 10:08:34 np0005626601.novalocal sshd-session[7995]: Connection closed by authenticating user root 157.20.215.3 port 43418 [preauth]
Feb 23 10:08:36 np0005626601.novalocal sshd-session[7997]: Connection closed by authenticating user root 157.20.215.3 port 43424 [preauth]
Feb 23 10:08:38 np0005626601.novalocal sshd-session[7999]: Connection closed by authenticating user root 157.20.215.3 port 40034 [preauth]
Feb 23 10:08:41 np0005626601.novalocal sshd-session[8001]: Connection closed by authenticating user root 157.20.215.3 port 40044 [preauth]
Feb 23 10:08:43 np0005626601.novalocal sshd-session[8003]: Connection closed by authenticating user root 157.20.215.3 port 40060 [preauth]
Feb 23 10:08:45 np0005626601.novalocal sshd-session[8005]: Connection closed by authenticating user root 157.20.215.3 port 40072 [preauth]
Feb 23 10:08:48 np0005626601.novalocal sshd-session[8007]: Connection closed by authenticating user root 157.20.215.3 port 35868 [preauth]
Feb 23 10:08:50 np0005626601.novalocal sshd-session[8009]: Connection closed by authenticating user root 157.20.215.3 port 35882 [preauth]
Feb 23 10:08:52 np0005626601.novalocal sshd-session[8011]: Connection closed by authenticating user root 157.20.215.3 port 35896 [preauth]
Feb 23 10:08:54 np0005626601.novalocal sshd-session[8013]: Connection closed by authenticating user root 157.20.215.3 port 35900 [preauth]
Feb 23 10:08:57 np0005626601.novalocal sshd-session[8015]: Connection closed by authenticating user root 157.20.215.3 port 35914 [preauth]
Feb 23 10:08:59 np0005626601.novalocal sshd-session[8017]: Connection closed by authenticating user root 157.20.215.3 port 56688 [preauth]
Feb 23 10:09:01 np0005626601.novalocal sshd-session[8019]: Connection closed by authenticating user root 157.20.215.3 port 56690 [preauth]
Feb 23 10:09:03 np0005626601.novalocal sshd-session[8021]: Connection closed by authenticating user root 157.20.215.3 port 56700 [preauth]
Feb 23 10:09:05 np0005626601.novalocal sshd-session[8023]: Connection closed by authenticating user root 157.20.215.3 port 56712 [preauth]
Feb 23 10:09:08 np0005626601.novalocal sshd-session[8025]: Connection closed by authenticating user root 157.20.215.3 port 51886 [preauth]
Feb 23 10:09:10 np0005626601.novalocal sshd-session[8027]: Connection closed by authenticating user root 157.20.215.3 port 51894 [preauth]
Feb 23 10:09:12 np0005626601.novalocal sshd-session[8029]: Connection closed by authenticating user root 157.20.215.3 port 51910 [preauth]
Feb 23 10:09:14 np0005626601.novalocal sshd-session[8031]: Connection closed by authenticating user root 157.20.215.3 port 51920 [preauth]
Feb 23 10:09:16 np0005626601.novalocal sshd-session[8033]: Connection closed by authenticating user root 157.20.215.3 port 51924 [preauth]
Feb 23 10:09:18 np0005626601.novalocal sshd-session[8035]: Connection closed by authenticating user root 157.20.215.3 port 43386 [preauth]
Feb 23 10:09:21 np0005626601.novalocal sshd-session[8037]: Connection closed by authenticating user root 157.20.215.3 port 43400 [preauth]
Feb 23 10:09:23 np0005626601.novalocal sshd-session[8039]: Connection closed by authenticating user root 157.20.215.3 port 43408 [preauth]
Feb 23 10:09:25 np0005626601.novalocal sshd-session[8041]: Connection closed by authenticating user root 157.20.215.3 port 43412 [preauth]
Feb 23 10:09:28 np0005626601.novalocal sshd-session[8043]: Connection closed by authenticating user root 157.20.215.3 port 43216 [preauth]
Feb 23 10:09:30 np0005626601.novalocal sshd-session[8045]: Connection closed by authenticating user root 157.20.215.3 port 43222 [preauth]
Feb 23 10:09:32 np0005626601.novalocal sshd-session[8047]: Connection closed by authenticating user root 157.20.215.3 port 43224 [preauth]
Feb 23 10:09:36 np0005626601.novalocal sshd-session[8049]: Connection closed by authenticating user root 157.20.215.3 port 43228 [preauth]
Feb 23 10:09:38 np0005626601.novalocal sshd-session[8051]: Connection closed by authenticating user root 157.20.215.3 port 44282 [preauth]
Feb 23 10:09:40 np0005626601.novalocal sshd-session[8053]: Connection closed by authenticating user root 157.20.215.3 port 44286 [preauth]
Feb 23 10:09:42 np0005626601.novalocal sshd-session[8055]: Connection closed by authenticating user root 157.20.215.3 port 44294 [preauth]
Feb 23 10:09:44 np0005626601.novalocal sshd-session[8057]: Connection closed by authenticating user root 157.20.215.3 port 44298 [preauth]
Feb 23 10:09:46 np0005626601.novalocal sshd-session[8059]: Connection closed by authenticating user root 157.20.215.3 port 44310 [preauth]
Feb 23 10:09:49 np0005626601.novalocal sshd-session[8061]: Connection closed by authenticating user root 157.20.215.3 port 48744 [preauth]
Feb 23 10:09:51 np0005626601.novalocal sshd-session[8063]: Connection closed by authenticating user root 157.20.215.3 port 48750 [preauth]
Feb 23 10:09:53 np0005626601.novalocal sshd-session[8065]: Connection closed by authenticating user root 157.20.215.3 port 48764 [preauth]
Feb 23 10:09:56 np0005626601.novalocal sshd-session[8067]: Connection closed by authenticating user root 157.20.215.3 port 48772 [preauth]
Feb 23 10:09:58 np0005626601.novalocal sshd-session[8069]: Connection closed by authenticating user root 157.20.215.3 port 57240 [preauth]
Feb 23 10:10:00 np0005626601.novalocal sshd-session[8071]: Connection closed by authenticating user root 157.20.215.3 port 57248 [preauth]
Feb 23 10:10:03 np0005626601.novalocal sshd-session[8073]: Connection closed by authenticating user root 157.20.215.3 port 57256 [preauth]
Feb 23 10:10:05 np0005626601.novalocal sshd-session[8075]: Connection closed by authenticating user root 157.20.215.3 port 57270 [preauth]
Feb 23 10:10:07 np0005626601.novalocal sshd-session[8077]: Connection closed by authenticating user root 157.20.215.3 port 57278 [preauth]
Feb 23 10:10:09 np0005626601.novalocal sshd-session[8079]: Connection closed by authenticating user root 157.20.215.3 port 60022 [preauth]
Feb 23 10:10:11 np0005626601.novalocal sshd-session[8081]: Connection closed by authenticating user root 157.20.215.3 port 60026 [preauth]
Feb 23 10:10:14 np0005626601.novalocal sshd-session[8083]: Connection closed by authenticating user root 157.20.215.3 port 60038 [preauth]
Feb 23 10:10:16 np0005626601.novalocal sshd-session[8085]: Connection closed by authenticating user root 157.20.215.3 port 60050 [preauth]
Feb 23 10:10:18 np0005626601.novalocal sshd-session[8087]: Connection closed by authenticating user root 157.20.215.3 port 52898 [preauth]
Feb 23 10:10:20 np0005626601.novalocal sshd-session[8089]: Connection closed by authenticating user root 157.20.215.3 port 52900 [preauth]
Feb 23 10:10:22 np0005626601.novalocal sshd-session[8091]: Connection closed by authenticating user root 157.20.215.3 port 52908 [preauth]
Feb 23 10:10:23 np0005626601.novalocal sshd-session[8093]: Connection closed by 143.198.30.3 port 35956
Feb 23 10:10:25 np0005626601.novalocal sshd-session[8094]: Connection closed by authenticating user root 157.20.215.3 port 52922 [preauth]
Feb 23 10:10:27 np0005626601.novalocal sshd-session[8096]: Connection closed by authenticating user root 157.20.215.3 port 52926 [preauth]
Feb 23 10:10:29 np0005626601.novalocal sshd-session[8098]: Connection closed by authenticating user root 157.20.215.3 port 33218 [preauth]
Feb 23 10:10:31 np0005626601.novalocal sshd-session[8100]: Connection closed by authenticating user root 157.20.215.3 port 33222 [preauth]
Feb 23 10:10:33 np0005626601.novalocal sshd-session[8102]: Connection closed by authenticating user root 157.20.215.3 port 33238 [preauth]
Feb 23 10:10:36 np0005626601.novalocal sshd-session[8104]: Connection closed by authenticating user root 157.20.215.3 port 33242 [preauth]
Feb 23 10:10:38 np0005626601.novalocal sshd-session[8106]: Connection closed by authenticating user root 157.20.215.3 port 36690 [preauth]
Feb 23 10:10:40 np0005626601.novalocal sshd-session[8108]: Connection closed by authenticating user root 157.20.215.3 port 36696 [preauth]
Feb 23 10:10:42 np0005626601.novalocal sshd-session[8110]: Connection closed by authenticating user root 157.20.215.3 port 36708 [preauth]
Feb 23 10:10:44 np0005626601.novalocal sshd-session[8112]: Connection closed by authenticating user root 157.20.215.3 port 36710 [preauth]
Feb 23 10:10:46 np0005626601.novalocal sshd-session[8114]: Connection closed by authenticating user root 157.20.215.3 port 36726 [preauth]
Feb 23 10:10:48 np0005626601.novalocal sshd-session[8116]: Connection closed by authenticating user root 157.20.215.3 port 51948 [preauth]
Feb 23 10:10:50 np0005626601.novalocal sshd-session[8118]: Connection closed by authenticating user root 157.20.215.3 port 51980 [preauth]
Feb 23 10:10:53 np0005626601.novalocal sshd-session[8120]: Connection closed by authenticating user root 157.20.215.3 port 51982 [preauth]
Feb 23 10:10:55 np0005626601.novalocal sshd-session[8123]: Connection closed by authenticating user root 157.20.215.3 port 51992 [preauth]
Feb 23 10:10:58 np0005626601.novalocal sshd-session[8125]: Connection closed by authenticating user root 157.20.215.3 port 52010 [preauth]
Feb 23 10:11:00 np0005626601.novalocal sshd-session[8127]: Connection closed by authenticating user root 157.20.215.3 port 58740 [preauth]
Feb 23 10:11:02 np0005626601.novalocal sshd-session[8129]: Connection closed by authenticating user root 157.20.215.3 port 58754 [preauth]
Feb 23 10:11:04 np0005626601.novalocal sshd-session[8131]: Connection closed by authenticating user root 157.20.215.3 port 58770 [preauth]
Feb 23 10:11:06 np0005626601.novalocal sshd-session[8133]: Connection closed by authenticating user root 157.20.215.3 port 58786 [preauth]
Feb 23 10:11:09 np0005626601.novalocal sshd-session[8135]: Connection closed by authenticating user root 157.20.215.3 port 59814 [preauth]
Feb 23 10:11:11 np0005626601.novalocal sshd-session[8137]: Connection closed by authenticating user root 157.20.215.3 port 59816 [preauth]
Feb 23 10:11:13 np0005626601.novalocal sshd-session[8139]: Connection closed by authenticating user root 157.20.215.3 port 59820 [preauth]
Feb 23 10:11:15 np0005626601.novalocal sshd-session[8141]: Invalid user user from 157.20.215.3 port 59830
Feb 23 10:11:16 np0005626601.novalocal sshd-session[8141]: Connection closed by invalid user user 157.20.215.3 port 59830 [preauth]
Feb 23 10:11:17 np0005626601.novalocal sshd-session[8143]: Invalid user user from 157.20.215.3 port 48214
Feb 23 10:11:18 np0005626601.novalocal sshd-session[8146]: Accepted publickey for zuul from 38.102.83.114 port 55422 ssh2: RSA SHA256:BiJLa2SOE3eAuOQ4B+aHpprkSCedZrO5BRA4A2P+Trc
Feb 23 10:11:18 np0005626601.novalocal systemd-logind[808]: New session 4 of user zuul.
Feb 23 10:11:18 np0005626601.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 23 10:11:18 np0005626601.novalocal sshd-session[8146]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:11:18 np0005626601.novalocal sudo[8173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwqntvpupojgusukrqpvqunwfrnezdjg ; /usr/bin/python3'
Feb 23 10:11:18 np0005626601.novalocal sudo[8173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:18 np0005626601.novalocal sshd-session[8143]: Connection closed by invalid user user 157.20.215.3 port 48214 [preauth]
Feb 23 10:11:18 np0005626601.novalocal python3[8175]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-6b14-18de-00000000216e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:11:18 np0005626601.novalocal sudo[8173]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:18 np0005626601.novalocal sudo[8204]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgpbyqaiyhhysewfuvhwoevilzrxdsga ; /usr/bin/python3'
Feb 23 10:11:18 np0005626601.novalocal sudo[8204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:18 np0005626601.novalocal python3[8206]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:11:18 np0005626601.novalocal sudo[8204]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:18 np0005626601.novalocal sudo[8230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnpaeuvkivlcehswrdvlfquvugobebhu ; /usr/bin/python3'
Feb 23 10:11:18 np0005626601.novalocal sudo[8230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:18 np0005626601.novalocal python3[8232]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:11:18 np0005626601.novalocal sudo[8230]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:18 np0005626601.novalocal sudo[8256]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuwctjqfofncwdmhxfypldqdcfmerzma ; /usr/bin/python3'
Feb 23 10:11:18 np0005626601.novalocal sudo[8256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:19 np0005626601.novalocal python3[8258]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:11:19 np0005626601.novalocal sudo[8256]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:19 np0005626601.novalocal sudo[8282]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioxyiqfqbtnzptucjusyngomcryydnro ; /usr/bin/python3'
Feb 23 10:11:19 np0005626601.novalocal sudo[8282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:19 np0005626601.novalocal python3[8284]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:11:19 np0005626601.novalocal sudo[8282]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:19 np0005626601.novalocal sudo[8308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdosscjiqnxeufvvycejpmiftpurdoxn ; /usr/bin/python3'
Feb 23 10:11:19 np0005626601.novalocal sudo[8308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:19 np0005626601.novalocal python3[8310]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:11:20 np0005626601.novalocal sudo[8308]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:20 np0005626601.novalocal sshd-session[8198]: Invalid user user from 157.20.215.3 port 48216
Feb 23 10:11:20 np0005626601.novalocal sudo[8386]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qprmfortxdhqyzhgnetusvadnwncahar ; /usr/bin/python3'
Feb 23 10:11:20 np0005626601.novalocal sudo[8386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:20 np0005626601.novalocal python3[8388]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:11:20 np0005626601.novalocal sshd-session[8198]: Connection closed by invalid user user 157.20.215.3 port 48216 [preauth]
Feb 23 10:11:20 np0005626601.novalocal sudo[8386]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:20 np0005626601.novalocal sudo[8459]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgqorxtrzvsiqnmhihvqncngslflrpwd ; /usr/bin/python3'
Feb 23 10:11:20 np0005626601.novalocal sudo[8459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:20 np0005626601.novalocal python3[8461]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771841480.2446413-522-218277216523583/source _original_basename=tmpgsgqicx0 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:11:20 np0005626601.novalocal sudo[8459]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:21 np0005626601.novalocal sudo[8511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elhwgpiyuzgfpqyckaetooathichsqyi ; /usr/bin/python3'
Feb 23 10:11:21 np0005626601.novalocal sudo[8511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:21 np0005626601.novalocal python3[8513]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:11:21 np0005626601.novalocal systemd[1]: Reloading.
Feb 23 10:11:21 np0005626601.novalocal systemd-rc-local-generator[8530]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:11:21 np0005626601.novalocal sudo[8511]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:22 np0005626601.novalocal sshd-session[8462]: Invalid user user from 157.20.215.3 port 48232
Feb 23 10:11:22 np0005626601.novalocal sshd-session[8462]: Connection closed by invalid user user 157.20.215.3 port 48232 [preauth]
Feb 23 10:11:23 np0005626601.novalocal sudo[8575]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktcfjunmczgyaxvaowbdrgioqaakpcgq ; /usr/bin/python3'
Feb 23 10:11:23 np0005626601.novalocal sudo[8575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:23 np0005626601.novalocal python3[8577]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 23 10:11:23 np0005626601.novalocal sudo[8575]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:23 np0005626601.novalocal sudo[8601]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anlwcqguamyyctsqgvgnyjkseezvoljj ; /usr/bin/python3'
Feb 23 10:11:23 np0005626601.novalocal sudo[8601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:23 np0005626601.novalocal python3[8603]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:11:23 np0005626601.novalocal sudo[8601]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:23 np0005626601.novalocal sudo[8629]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhmntmecmimhdeqniougfaltospfyrkw ; /usr/bin/python3'
Feb 23 10:11:23 np0005626601.novalocal sudo[8629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:24 np0005626601.novalocal python3[8631]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:11:24 np0005626601.novalocal sudo[8629]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:24 np0005626601.novalocal sudo[8657]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjpjxcaxatnkxelmemhjnvqfncnktqjz ; /usr/bin/python3'
Feb 23 10:11:24 np0005626601.novalocal sudo[8657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:24 np0005626601.novalocal python3[8659]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:11:24 np0005626601.novalocal sudo[8657]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:24 np0005626601.novalocal sudo[8685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcmdmqnwupzmjrjhjardtknstbdefqma ; /usr/bin/python3'
Feb 23 10:11:24 np0005626601.novalocal sudo[8685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:24 np0005626601.novalocal sshd-session[8550]: Invalid user user from 157.20.215.3 port 48238
Feb 23 10:11:24 np0005626601.novalocal python3[8687]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:11:24 np0005626601.novalocal sudo[8685]: pam_unix(sudo:session): session closed for user root
Feb 23 10:11:24 np0005626601.novalocal sshd-session[8550]: Connection closed by invalid user user 157.20.215.3 port 48238 [preauth]
Feb 23 10:11:25 np0005626601.novalocal python3[8716]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163e3b-3c83-6b14-18de-000000002175-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:11:25 np0005626601.novalocal python3[8746]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 10:11:26 np0005626601.novalocal sshd-session[8691]: Invalid user user from 157.20.215.3 port 48250
Feb 23 10:11:26 np0005626601.novalocal sshd-session[8691]: Connection closed by invalid user user 157.20.215.3 port 48250 [preauth]
Feb 23 10:11:28 np0005626601.novalocal sshd-session[8149]: Connection closed by 38.102.83.114 port 55422
Feb 23 10:11:28 np0005626601.novalocal sshd-session[8146]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:11:28 np0005626601.novalocal systemd-logind[808]: Session 4 logged out. Waiting for processes to exit.
Feb 23 10:11:28 np0005626601.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 23 10:11:28 np0005626601.novalocal systemd[1]: session-4.scope: Consumed 3.340s CPU time.
Feb 23 10:11:28 np0005626601.novalocal systemd-logind[808]: Removed session 4.
Feb 23 10:11:28 np0005626601.novalocal sshd-session[8750]: Invalid user user from 157.20.215.3 port 36712
Feb 23 10:11:29 np0005626601.novalocal sshd-session[8750]: Connection closed by invalid user user 157.20.215.3 port 36712 [preauth]
Feb 23 10:11:30 np0005626601.novalocal sshd-session[8756]: Accepted publickey for zuul from 38.102.83.114 port 37024 ssh2: RSA SHA256:BiJLa2SOE3eAuOQ4B+aHpprkSCedZrO5BRA4A2P+Trc
Feb 23 10:11:30 np0005626601.novalocal systemd-logind[808]: New session 5 of user zuul.
Feb 23 10:11:30 np0005626601.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 23 10:11:30 np0005626601.novalocal sshd-session[8756]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:11:30 np0005626601.novalocal sudo[8783]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icvnoiruxkvrsxgpmzkybhujnanmbqmh ; /usr/bin/python3'
Feb 23 10:11:30 np0005626601.novalocal sudo[8783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:11:30 np0005626601.novalocal python3[8785]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 10:11:31 np0005626601.novalocal sshd-session[8754]: Invalid user user from 157.20.215.3 port 36718
Feb 23 10:11:31 np0005626601.novalocal sshd-session[8754]: Connection closed by invalid user user 157.20.215.3 port 36718 [preauth]
Feb 23 10:11:33 np0005626601.novalocal sshd-session[8790]: Invalid user user from 157.20.215.3 port 36724
Feb 23 10:11:33 np0005626601.novalocal sshd-session[8790]: Connection closed by invalid user user 157.20.215.3 port 36724 [preauth]
Feb 23 10:11:35 np0005626601.novalocal sshd-session[8813]: Invalid user user from 157.20.215.3 port 36736
Feb 23 10:11:35 np0005626601.novalocal sshd-session[8813]: Connection closed by invalid user user 157.20.215.3 port 36736 [preauth]
Feb 23 10:11:36 np0005626601.novalocal setsebool[8829]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 23 10:11:36 np0005626601.novalocal setsebool[8829]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 23 10:11:37 np0005626601.novalocal sshd-session[8830]: Invalid user user from 157.20.215.3 port 44492
Feb 23 10:11:38 np0005626601.novalocal sshd-session[8830]: Connection closed by invalid user user 157.20.215.3 port 44492 [preauth]
Feb 23 10:11:39 np0005626601.novalocal sshd-session[8839]: Invalid user user from 157.20.215.3 port 44494
Feb 23 10:11:40 np0005626601.novalocal sshd-session[8839]: Connection closed by invalid user user 157.20.215.3 port 44494 [preauth]
Feb 23 10:11:42 np0005626601.novalocal sshd-session[8842]: Invalid user user from 157.20.215.3 port 44506
Feb 23 10:11:42 np0005626601.novalocal sshd-session[8842]: Connection closed by invalid user user 157.20.215.3 port 44506 [preauth]
Feb 23 10:11:44 np0005626601.novalocal sshd-session[8844]: Invalid user user from 157.20.215.3 port 44510
Feb 23 10:11:44 np0005626601.novalocal sshd-session[8844]: Connection closed by invalid user user 157.20.215.3 port 44510 [preauth]
Feb 23 10:11:46 np0005626601.novalocal sshd-session[8846]: Invalid user user from 157.20.215.3 port 44526
Feb 23 10:11:47 np0005626601.novalocal sshd-session[8846]: Connection closed by invalid user user 157.20.215.3 port 44526 [preauth]
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:11:48 np0005626601.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:11:49 np0005626601.novalocal sshd-session[8848]: Invalid user user from 157.20.215.3 port 46236
Feb 23 10:11:49 np0005626601.novalocal sshd-session[8848]: Connection closed by invalid user user 157.20.215.3 port 46236 [preauth]
Feb 23 10:11:51 np0005626601.novalocal sshd-session[8868]: Invalid user user from 157.20.215.3 port 46244
Feb 23 10:11:51 np0005626601.novalocal sshd-session[8868]: Connection closed by invalid user user 157.20.215.3 port 46244 [preauth]
Feb 23 10:11:53 np0005626601.novalocal sshd-session[8870]: Invalid user user from 157.20.215.3 port 46246
Feb 23 10:11:53 np0005626601.novalocal sshd-session[8870]: Connection closed by invalid user user 157.20.215.3 port 46246 [preauth]
Feb 23 10:11:55 np0005626601.novalocal sshd-session[8872]: Invalid user user from 157.20.215.3 port 46258
Feb 23 10:11:55 np0005626601.novalocal sshd-session[8872]: Connection closed by invalid user user 157.20.215.3 port 46258 [preauth]
Feb 23 10:11:57 np0005626601.novalocal sshd-session[8874]: Invalid user user from 157.20.215.3 port 60798
Feb 23 10:11:58 np0005626601.novalocal sshd-session[8874]: Connection closed by invalid user user 157.20.215.3 port 60798 [preauth]
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:11:58 np0005626601.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:12:00 np0005626601.novalocal sshd-session[8882]: Invalid user user from 157.20.215.3 port 60808
Feb 23 10:12:00 np0005626601.novalocal sshd-session[8882]: Connection closed by invalid user user 157.20.215.3 port 60808 [preauth]
Feb 23 10:12:02 np0005626601.novalocal sshd-session[9604]: Invalid user user from 157.20.215.3 port 60814
Feb 23 10:12:02 np0005626601.novalocal sshd-session[9604]: Connection closed by invalid user user 157.20.215.3 port 60814 [preauth]
Feb 23 10:12:04 np0005626601.novalocal sshd-session[9606]: Invalid user user from 157.20.215.3 port 60826
Feb 23 10:12:04 np0005626601.novalocal sshd-session[9606]: Connection closed by invalid user user 157.20.215.3 port 60826 [preauth]
Feb 23 10:12:06 np0005626601.novalocal sshd-session[9608]: Invalid user user from 157.20.215.3 port 60832
Feb 23 10:12:06 np0005626601.novalocal sshd-session[9608]: Connection closed by invalid user user 157.20.215.3 port 60832 [preauth]
Feb 23 10:12:08 np0005626601.novalocal sshd-session[9610]: Invalid user user from 157.20.215.3 port 40048
Feb 23 10:12:08 np0005626601.novalocal sshd-session[9610]: Connection closed by invalid user user 157.20.215.3 port 40048 [preauth]
Feb 23 10:12:10 np0005626601.novalocal sshd-session[9612]: Invalid user user from 157.20.215.3 port 40054
Feb 23 10:12:10 np0005626601.novalocal sshd-session[9612]: Connection closed by invalid user user 157.20.215.3 port 40054 [preauth]
Feb 23 10:12:12 np0005626601.novalocal sshd-session[9614]: Invalid user user from 157.20.215.3 port 40062
Feb 23 10:12:12 np0005626601.novalocal sshd-session[9614]: Connection closed by invalid user user 157.20.215.3 port 40062 [preauth]
Feb 23 10:12:14 np0005626601.novalocal sshd-session[9616]: Invalid user user from 157.20.215.3 port 40072
Feb 23 10:12:14 np0005626601.novalocal sshd-session[9616]: Connection closed by invalid user user 157.20.215.3 port 40072 [preauth]
Feb 23 10:12:16 np0005626601.novalocal dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 23 10:12:16 np0005626601.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:12:16 np0005626601.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:12:16 np0005626601.novalocal systemd[1]: Reloading.
Feb 23 10:12:16 np0005626601.novalocal systemd-rc-local-generator[9658]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:12:16 np0005626601.novalocal sshd-session[9618]: Invalid user user from 157.20.215.3 port 40080
Feb 23 10:12:16 np0005626601.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:12:17 np0005626601.novalocal sshd-session[9618]: Connection closed by invalid user user 157.20.215.3 port 40080 [preauth]
Feb 23 10:12:17 np0005626601.novalocal sudo[8783]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:19 np0005626601.novalocal sshd-session[10575]: Invalid user user from 157.20.215.3 port 36202
Feb 23 10:12:19 np0005626601.novalocal sshd-session[10575]: Connection closed by invalid user user 157.20.215.3 port 36202 [preauth]
Feb 23 10:12:20 np0005626601.novalocal python3[13788]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163e3b-3c83-7f91-1b0d-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:12:21 np0005626601.novalocal sshd-session[13078]: Invalid user user from 157.20.215.3 port 36214
Feb 23 10:12:21 np0005626601.novalocal sshd-session[13078]: Connection closed by invalid user user 157.20.215.3 port 36214 [preauth]
Feb 23 10:12:21 np0005626601.novalocal kernel: evm: overlay not supported
Feb 23 10:12:21 np0005626601.novalocal systemd[4803]: Starting D-Bus User Message Bus...
Feb 23 10:12:22 np0005626601.novalocal dbus-broker-launch[14560]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 23 10:12:22 np0005626601.novalocal dbus-broker-launch[14560]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: Started D-Bus User Message Bus.
Feb 23 10:12:22 np0005626601.novalocal dbus-broker-lau[14560]: Ready
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: Created slice Slice /user.
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: podman-14454.scope: unit configures an IP firewall, but not running as root.
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: (This warning is only shown for the first unit using IP firewalling.)
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: Started podman-14454.scope.
Feb 23 10:12:22 np0005626601.novalocal systemd[4803]: Started podman-pause-df803165.scope.
Feb 23 10:12:22 np0005626601.novalocal sudo[14834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdsmsjwoupslgpkokkvhhbrllwsfnpsv ; /usr/bin/python3'
Feb 23 10:12:22 np0005626601.novalocal sudo[14834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:22 np0005626601.novalocal python3[14843]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.111:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.111:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:12:22 np0005626601.novalocal python3[14843]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 23 10:12:22 np0005626601.novalocal sudo[14834]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:23 np0005626601.novalocal sshd-session[8759]: Connection closed by 38.102.83.114 port 37024
Feb 23 10:12:23 np0005626601.novalocal sshd-session[8756]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:12:23 np0005626601.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 23 10:12:23 np0005626601.novalocal systemd[1]: session-5.scope: Consumed 42.603s CPU time.
Feb 23 10:12:23 np0005626601.novalocal systemd-logind[808]: Session 5 logged out. Waiting for processes to exit.
Feb 23 10:12:23 np0005626601.novalocal systemd-logind[808]: Removed session 5.
Feb 23 10:12:23 np0005626601.novalocal sshd-session[14600]: Invalid user user from 157.20.215.3 port 36228
Feb 23 10:12:23 np0005626601.novalocal sshd-session[14600]: Connection closed by invalid user user 157.20.215.3 port 36228 [preauth]
Feb 23 10:12:25 np0005626601.novalocal sshd-session[16485]: Connection closed by authenticating user root 143.198.30.3 port 32876 [preauth]
Feb 23 10:12:25 np0005626601.novalocal sshd-session[15449]: Invalid user user from 157.20.215.3 port 36236
Feb 23 10:12:26 np0005626601.novalocal sshd-session[15449]: Connection closed by invalid user user 157.20.215.3 port 36236 [preauth]
Feb 23 10:12:27 np0005626601.novalocal sshd-session[17270]: Invalid user user from 157.20.215.3 port 58822
Feb 23 10:12:28 np0005626601.novalocal sshd-session[17270]: Connection closed by invalid user user 157.20.215.3 port 58822 [preauth]
Feb 23 10:12:29 np0005626601.novalocal sshd-session[18532]: Invalid user user from 157.20.215.3 port 58834
Feb 23 10:12:30 np0005626601.novalocal sshd-session[18532]: Connection closed by invalid user user 157.20.215.3 port 58834 [preauth]
Feb 23 10:12:32 np0005626601.novalocal sshd-session[19995]: Invalid user user from 157.20.215.3 port 58838
Feb 23 10:12:32 np0005626601.novalocal sshd-session[19995]: Connection closed by invalid user user 157.20.215.3 port 58838 [preauth]
Feb 23 10:12:34 np0005626601.novalocal sshd-session[21278]: Invalid user user from 157.20.215.3 port 58844
Feb 23 10:12:34 np0005626601.novalocal sshd-session[21278]: Connection closed by invalid user user 157.20.215.3 port 58844 [preauth]
Feb 23 10:12:36 np0005626601.novalocal sshd-session[22521]: Invalid user user from 157.20.215.3 port 58856
Feb 23 10:12:36 np0005626601.novalocal sshd-session[22521]: Connection closed by invalid user user 157.20.215.3 port 58856 [preauth]
Feb 23 10:12:38 np0005626601.novalocal sshd-session[23986]: Invalid user user from 157.20.215.3 port 37794
Feb 23 10:12:39 np0005626601.novalocal sshd-session[23986]: Connection closed by invalid user user 157.20.215.3 port 37794 [preauth]
Feb 23 10:12:40 np0005626601.novalocal sshd-session[25525]: Invalid user user from 157.20.215.3 port 37798
Feb 23 10:12:41 np0005626601.novalocal sshd-session[26734]: Connection closed by 38.102.83.129 port 55514 [preauth]
Feb 23 10:12:41 np0005626601.novalocal sshd-session[26735]: Connection closed by 38.102.83.129 port 55526 [preauth]
Feb 23 10:12:41 np0005626601.novalocal sshd-session[26738]: Unable to negotiate with 38.102.83.129 port 55540: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 23 10:12:41 np0005626601.novalocal sshd-session[26740]: Unable to negotiate with 38.102.83.129 port 55572: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 23 10:12:41 np0005626601.novalocal sshd-session[26741]: Unable to negotiate with 38.102.83.129 port 55556: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 23 10:12:41 np0005626601.novalocal sshd-session[25525]: Connection closed by invalid user user 157.20.215.3 port 37798 [preauth]
Feb 23 10:12:41 np0005626601.novalocal irqbalance[802]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 23 10:12:41 np0005626601.novalocal irqbalance[802]: IRQ 27 affinity is now unmanaged
Feb 23 10:12:43 np0005626601.novalocal sshd-session[26928]: Invalid user user from 157.20.215.3 port 37806
Feb 23 10:12:43 np0005626601.novalocal sshd-session[26928]: Connection closed by invalid user user 157.20.215.3 port 37806 [preauth]
Feb 23 10:12:45 np0005626601.novalocal sshd-session[28774]: Accepted publickey for zuul from 38.102.83.114 port 46956 ssh2: RSA SHA256:BiJLa2SOE3eAuOQ4B+aHpprkSCedZrO5BRA4A2P+Trc
Feb 23 10:12:45 np0005626601.novalocal systemd-logind[808]: New session 6 of user zuul.
Feb 23 10:12:45 np0005626601.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 23 10:12:45 np0005626601.novalocal sshd-session[28774]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:12:45 np0005626601.novalocal sshd-session[28016]: Invalid user user from 157.20.215.3 port 37818
Feb 23 10:12:45 np0005626601.novalocal python3[28917]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGGAoDmc0nruX0JS9omzP8injw4ANPFs+SMg4edeaN6DbUHUNHwW7+u4MJXbAiwsbQAYhNTxoqMqTYpQlGEsFJA= zuul@np0005626600.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:12:45 np0005626601.novalocal sshd-session[28016]: Connection closed by invalid user user 157.20.215.3 port 37818 [preauth]
Feb 23 10:12:45 np0005626601.novalocal sudo[29140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chigthgwisetbvmqejzlwrsvcmfirnmf ; /usr/bin/python3'
Feb 23 10:12:45 np0005626601.novalocal sudo[29140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:45 np0005626601.novalocal python3[29150]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGGAoDmc0nruX0JS9omzP8injw4ANPFs+SMg4edeaN6DbUHUNHwW7+u4MJXbAiwsbQAYhNTxoqMqTYpQlGEsFJA= zuul@np0005626600.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:12:45 np0005626601.novalocal sudo[29140]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:46 np0005626601.novalocal sudo[29585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwfhvxomunafezvcertymnobelevtiyr ; /usr/bin/python3'
Feb 23 10:12:46 np0005626601.novalocal sudo[29585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:46 np0005626601.novalocal python3[29596]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626601.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 23 10:12:46 np0005626601.novalocal useradd[29676]: new group: name=cloud-admin, GID=1002
Feb 23 10:12:46 np0005626601.novalocal useradd[29676]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 23 10:12:46 np0005626601.novalocal sudo[29585]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:46 np0005626601.novalocal sudo[29835]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejjmohxfacwhsqcvyblvbwhetysxybcx ; /usr/bin/python3'
Feb 23 10:12:47 np0005626601.novalocal sudo[29835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:47 np0005626601.novalocal python3[29845]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGGAoDmc0nruX0JS9omzP8injw4ANPFs+SMg4edeaN6DbUHUNHwW7+u4MJXbAiwsbQAYhNTxoqMqTYpQlGEsFJA= zuul@np0005626600.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 10:12:47 np0005626601.novalocal sudo[29835]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:47 np0005626601.novalocal sudo[30188]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrggdwisomyftstqbgrdjudidejrnalw ; /usr/bin/python3'
Feb 23 10:12:47 np0005626601.novalocal sudo[30188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:47 np0005626601.novalocal python3[30197]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:12:47 np0005626601.novalocal sudo[30188]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:47 np0005626601.novalocal sshd-session[29313]: Invalid user user from 157.20.215.3 port 36324
Feb 23 10:12:47 np0005626601.novalocal sudo[30522]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naopeidwlwdhwndrxposkhvktmlvldqr ; /usr/bin/python3'
Feb 23 10:12:47 np0005626601.novalocal sudo[30522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:47 np0005626601.novalocal python3[30532]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771841567.3359132-151-155383483645511/source _original_basename=tmp54lbq3_r follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:12:47 np0005626601.novalocal sshd-session[29313]: Connection closed by invalid user user 157.20.215.3 port 36324 [preauth]
Feb 23 10:12:47 np0005626601.novalocal sudo[30522]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:48 np0005626601.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:12:48 np0005626601.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:12:48 np0005626601.novalocal systemd[1]: man-db-cache-update.service: Consumed 33.936s CPU time.
Feb 23 10:12:48 np0005626601.novalocal systemd[1]: run-re1363a14680c4170a4ccf69a748b63cd.service: Deactivated successfully.
Feb 23 10:12:48 np0005626601.novalocal sudo[30751]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubskxaoydwietvwshjqyoyonhdxbjolm ; /usr/bin/python3'
Feb 23 10:12:48 np0005626601.novalocal sudo[30751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:12:48 np0005626601.novalocal python3[30753]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 23 10:12:48 np0005626601.novalocal systemd[1]: Starting Hostname Service...
Feb 23 10:12:48 np0005626601.novalocal systemd[1]: Started Hostname Service.
Feb 23 10:12:48 np0005626601.novalocal systemd-hostnamed[30757]: Changed pretty hostname to 'compute-0'
Feb 23 10:12:48 compute-0 systemd-hostnamed[30757]: Hostname set to <compute-0> (static)
Feb 23 10:12:48 compute-0 NetworkManager[7689]: <info>  [1771841568.8635] hostname: static hostname changed from "np0005626601.novalocal" to "compute-0"
Feb 23 10:12:48 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 10:12:48 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 10:12:48 compute-0 sudo[30751]: pam_unix(sudo:session): session closed for user root
Feb 23 10:12:49 compute-0 sshd-session[28848]: Connection closed by 38.102.83.114 port 46956
Feb 23 10:12:49 compute-0 sshd-session[28774]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:12:49 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 23 10:12:49 compute-0 systemd[1]: session-6.scope: Consumed 1.758s CPU time.
Feb 23 10:12:49 compute-0 systemd-logind[808]: Session 6 logged out. Waiting for processes to exit.
Feb 23 10:12:49 compute-0 systemd-logind[808]: Removed session 6.
Feb 23 10:12:49 compute-0 sshd-session[30726]: Invalid user user from 157.20.215.3 port 36338
Feb 23 10:12:50 compute-0 sshd-session[30726]: Connection closed by invalid user user 157.20.215.3 port 36338 [preauth]
Feb 23 10:12:51 compute-0 sshd-session[30770]: Invalid user user from 157.20.215.3 port 36354
Feb 23 10:12:52 compute-0 sshd-session[30770]: Connection closed by invalid user user 157.20.215.3 port 36354 [preauth]
Feb 23 10:12:54 compute-0 sshd-session[30772]: Invalid user user from 157.20.215.3 port 36366
Feb 23 10:12:54 compute-0 sshd-session[30772]: Connection closed by invalid user user 157.20.215.3 port 36366 [preauth]
Feb 23 10:12:56 compute-0 sshd-session[30774]: Invalid user user from 157.20.215.3 port 36372
Feb 23 10:12:56 compute-0 sshd-session[30774]: Connection closed by invalid user user 157.20.215.3 port 36372 [preauth]
Feb 23 10:12:58 compute-0 sshd-session[30776]: Invalid user user from 157.20.215.3 port 56338
Feb 23 10:12:58 compute-0 sshd-session[30776]: Connection closed by invalid user user 157.20.215.3 port 56338 [preauth]
Feb 23 10:12:58 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 10:13:01 compute-0 sshd-session[30778]: Invalid user user from 157.20.215.3 port 56354
Feb 23 10:13:01 compute-0 sshd-session[30778]: Connection closed by invalid user user 157.20.215.3 port 56354 [preauth]
Feb 23 10:13:03 compute-0 sshd-session[30780]: Invalid user user from 157.20.215.3 port 56364
Feb 23 10:13:04 compute-0 sshd-session[30780]: Connection closed by invalid user user 157.20.215.3 port 56364 [preauth]
Feb 23 10:13:06 compute-0 sshd-session[30782]: Invalid user user from 157.20.215.3 port 56378
Feb 23 10:13:06 compute-0 sshd-session[30782]: Connection closed by invalid user user 157.20.215.3 port 56378 [preauth]
Feb 23 10:13:08 compute-0 sshd-session[30784]: Invalid user user from 157.20.215.3 port 39146
Feb 23 10:13:08 compute-0 sshd-session[30784]: Connection closed by invalid user user 157.20.215.3 port 39146 [preauth]
Feb 23 10:13:10 compute-0 sshd-session[30786]: Invalid user user from 157.20.215.3 port 39156
Feb 23 10:13:10 compute-0 sshd-session[30786]: Connection closed by invalid user user 157.20.215.3 port 39156 [preauth]
Feb 23 10:13:12 compute-0 sshd-session[30788]: Invalid user user from 157.20.215.3 port 39164
Feb 23 10:13:12 compute-0 sshd-session[30788]: Connection closed by invalid user user 157.20.215.3 port 39164 [preauth]
Feb 23 10:13:14 compute-0 sshd-session[30790]: Invalid user user from 157.20.215.3 port 39176
Feb 23 10:13:15 compute-0 sshd-session[30790]: Connection closed by invalid user user 157.20.215.3 port 39176 [preauth]
Feb 23 10:13:17 compute-0 sshd-session[30792]: Invalid user user from 157.20.215.3 port 39178
Feb 23 10:13:17 compute-0 sshd-session[30792]: Connection closed by invalid user user 157.20.215.3 port 39178 [preauth]
Feb 23 10:13:18 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 10:13:19 compute-0 sshd-session[30794]: Invalid user user from 157.20.215.3 port 38300
Feb 23 10:13:19 compute-0 sshd-session[30794]: Connection closed by invalid user user 157.20.215.3 port 38300 [preauth]
Feb 23 10:13:20 compute-0 sshd-session[30801]: Connection closed by authenticating user root 143.198.30.3 port 46616 [preauth]
Feb 23 10:13:21 compute-0 sshd-session[30799]: Invalid user user from 157.20.215.3 port 38316
Feb 23 10:13:21 compute-0 sshd-session[30799]: Connection closed by invalid user user 157.20.215.3 port 38316 [preauth]
Feb 23 10:13:23 compute-0 sshd-session[30803]: Invalid user user from 157.20.215.3 port 38332
Feb 23 10:13:23 compute-0 sshd-session[30803]: Connection closed by invalid user user 157.20.215.3 port 38332 [preauth]
Feb 23 10:13:25 compute-0 sshd-session[30805]: Invalid user user from 157.20.215.3 port 38342
Feb 23 10:13:26 compute-0 sshd-session[30805]: Connection closed by invalid user user 157.20.215.3 port 38342 [preauth]
Feb 23 10:13:28 compute-0 sshd-session[30807]: Invalid user user from 157.20.215.3 port 50818
Feb 23 10:13:28 compute-0 sshd-session[30807]: Connection closed by invalid user user 157.20.215.3 port 50818 [preauth]
Feb 23 10:13:30 compute-0 sshd-session[30809]: Invalid user user from 157.20.215.3 port 50824
Feb 23 10:13:30 compute-0 sshd-session[30809]: Connection closed by invalid user user 157.20.215.3 port 50824 [preauth]
Feb 23 10:13:32 compute-0 sshd-session[30811]: Invalid user user from 157.20.215.3 port 50840
Feb 23 10:13:32 compute-0 sshd-session[30811]: Connection closed by invalid user user 157.20.215.3 port 50840 [preauth]
Feb 23 10:13:34 compute-0 sshd-session[30813]: Invalid user user from 157.20.215.3 port 50842
Feb 23 10:13:35 compute-0 sshd-session[30813]: Connection closed by invalid user user 157.20.215.3 port 50842 [preauth]
Feb 23 10:13:36 compute-0 sshd-session[30815]: Invalid user user from 157.20.215.3 port 50846
Feb 23 10:13:37 compute-0 sshd-session[30815]: Connection closed by invalid user user 157.20.215.3 port 50846 [preauth]
Feb 23 10:13:39 compute-0 sshd-session[30817]: Invalid user user from 157.20.215.3 port 50612
Feb 23 10:13:39 compute-0 sshd-session[30817]: Connection closed by invalid user user 157.20.215.3 port 50612 [preauth]
Feb 23 10:13:41 compute-0 sshd-session[30819]: Invalid user user from 157.20.215.3 port 50628
Feb 23 10:13:41 compute-0 sshd-session[30819]: Connection closed by invalid user user 157.20.215.3 port 50628 [preauth]
Feb 23 10:13:43 compute-0 sshd-session[30821]: Invalid user user from 157.20.215.3 port 50640
Feb 23 10:13:43 compute-0 sshd-session[30821]: Connection closed by invalid user user 157.20.215.3 port 50640 [preauth]
Feb 23 10:13:45 compute-0 sshd-session[30823]: Invalid user user from 157.20.215.3 port 50652
Feb 23 10:13:45 compute-0 sshd-session[30823]: Connection closed by invalid user user 157.20.215.3 port 50652 [preauth]
Feb 23 10:13:47 compute-0 sshd-session[30825]: Invalid user user from 157.20.215.3 port 46586
Feb 23 10:13:48 compute-0 sshd-session[30825]: Connection closed by invalid user user 157.20.215.3 port 46586 [preauth]
Feb 23 10:13:50 compute-0 sshd-session[30827]: Invalid user user from 157.20.215.3 port 46592
Feb 23 10:13:50 compute-0 sshd-session[30827]: Connection closed by invalid user user 157.20.215.3 port 46592 [preauth]
Feb 23 10:13:52 compute-0 sshd-session[30829]: Invalid user user from 157.20.215.3 port 46602
Feb 23 10:13:52 compute-0 sshd-session[30829]: Connection closed by invalid user user 157.20.215.3 port 46602 [preauth]
Feb 23 10:13:54 compute-0 sshd-session[30831]: Invalid user user from 157.20.215.3 port 46610
Feb 23 10:13:54 compute-0 sshd-session[30831]: Connection closed by invalid user user 157.20.215.3 port 46610 [preauth]
Feb 23 10:13:56 compute-0 sshd-session[30833]: Invalid user user from 157.20.215.3 port 46614
Feb 23 10:13:56 compute-0 sshd-session[30833]: Connection closed by invalid user user 157.20.215.3 port 46614 [preauth]
Feb 23 10:13:58 compute-0 sshd-session[30835]: Invalid user user from 157.20.215.3 port 56134
Feb 23 10:13:59 compute-0 sshd-session[30835]: Connection closed by invalid user user 157.20.215.3 port 56134 [preauth]
Feb 23 10:14:00 compute-0 sshd-session[30837]: Invalid user user from 157.20.215.3 port 56146
Feb 23 10:14:01 compute-0 sshd-session[30837]: Connection closed by invalid user user 157.20.215.3 port 56146 [preauth]
Feb 23 10:14:02 compute-0 sshd-session[30839]: Invalid user user from 157.20.215.3 port 56160
Feb 23 10:14:03 compute-0 sshd-session[30839]: Connection closed by invalid user user 157.20.215.3 port 56160 [preauth]
Feb 23 10:14:05 compute-0 sshd-session[30841]: Invalid user user from 157.20.215.3 port 56168
Feb 23 10:14:05 compute-0 sshd-session[30841]: Connection closed by invalid user user 157.20.215.3 port 56168 [preauth]
Feb 23 10:14:07 compute-0 sshd-session[30843]: Invalid user user from 157.20.215.3 port 52812
Feb 23 10:14:07 compute-0 sshd-session[30843]: Connection closed by invalid user user 157.20.215.3 port 52812 [preauth]
Feb 23 10:14:09 compute-0 sshd-session[30845]: Invalid user user from 157.20.215.3 port 52828
Feb 23 10:14:10 compute-0 sshd-session[30845]: Connection closed by invalid user user 157.20.215.3 port 52828 [preauth]
Feb 23 10:14:11 compute-0 sshd-session[30848]: Invalid user user from 157.20.215.3 port 52840
Feb 23 10:14:12 compute-0 sshd-session[30848]: Connection closed by invalid user user 157.20.215.3 port 52840 [preauth]
Feb 23 10:14:14 compute-0 sshd-session[30850]: Invalid user user from 157.20.215.3 port 52844
Feb 23 10:14:14 compute-0 sshd-session[30852]: Connection closed by authenticating user root 143.198.30.3 port 54202 [preauth]
Feb 23 10:14:14 compute-0 sshd-session[30850]: Connection closed by invalid user user 157.20.215.3 port 52844 [preauth]
Feb 23 10:14:16 compute-0 sshd-session[30854]: Invalid user ubuntu from 157.20.215.3 port 52854
Feb 23 10:14:16 compute-0 sshd-session[30854]: Connection closed by invalid user ubuntu 157.20.215.3 port 52854 [preauth]
Feb 23 10:14:18 compute-0 sshd-session[30856]: Invalid user ubuntu from 157.20.215.3 port 46098
Feb 23 10:14:18 compute-0 sshd-session[30856]: Connection closed by invalid user ubuntu 157.20.215.3 port 46098 [preauth]
Feb 23 10:14:20 compute-0 sshd-session[30859]: Invalid user ubuntu from 157.20.215.3 port 46104
Feb 23 10:14:21 compute-0 sshd-session[30859]: Connection closed by invalid user ubuntu 157.20.215.3 port 46104 [preauth]
Feb 23 10:14:22 compute-0 sshd-session[30861]: Invalid user ubuntu from 157.20.215.3 port 46120
Feb 23 10:14:23 compute-0 sshd-session[30861]: Connection closed by invalid user ubuntu 157.20.215.3 port 46120 [preauth]
Feb 23 10:14:25 compute-0 sshd-session[30863]: Invalid user ubuntu from 157.20.215.3 port 46132
Feb 23 10:14:25 compute-0 sshd-session[30863]: Connection closed by invalid user ubuntu 157.20.215.3 port 46132 [preauth]
Feb 23 10:14:27 compute-0 sshd-session[30865]: Invalid user ubuntu from 157.20.215.3 port 52842
Feb 23 10:14:27 compute-0 sshd-session[30865]: Connection closed by invalid user ubuntu 157.20.215.3 port 52842 [preauth]
Feb 23 10:14:29 compute-0 sshd-session[30867]: Invalid user ubuntu from 157.20.215.3 port 52848
Feb 23 10:14:29 compute-0 sshd-session[30867]: Connection closed by invalid user ubuntu 157.20.215.3 port 52848 [preauth]
Feb 23 10:14:31 compute-0 sshd-session[30869]: Invalid user ubuntu from 157.20.215.3 port 52856
Feb 23 10:14:31 compute-0 sshd-session[30869]: Connection closed by invalid user ubuntu 157.20.215.3 port 52856 [preauth]
Feb 23 10:14:33 compute-0 sshd-session[30871]: Invalid user ubuntu from 157.20.215.3 port 52870
Feb 23 10:14:33 compute-0 sshd-session[30871]: Connection closed by invalid user ubuntu 157.20.215.3 port 52870 [preauth]
Feb 23 10:14:35 compute-0 sshd-session[30873]: Invalid user ubuntu from 157.20.215.3 port 52880
Feb 23 10:14:36 compute-0 sshd-session[30873]: Connection closed by invalid user ubuntu 157.20.215.3 port 52880 [preauth]
Feb 23 10:14:38 compute-0 sshd-session[30875]: Invalid user ubuntu from 157.20.215.3 port 43274
Feb 23 10:14:38 compute-0 sshd-session[30875]: Connection closed by invalid user ubuntu 157.20.215.3 port 43274 [preauth]
Feb 23 10:14:40 compute-0 sshd-session[30877]: Invalid user ubuntu from 157.20.215.3 port 43278
Feb 23 10:14:40 compute-0 sshd-session[30877]: Connection closed by invalid user ubuntu 157.20.215.3 port 43278 [preauth]
Feb 23 10:14:42 compute-0 sshd-session[30879]: Invalid user ubuntu from 157.20.215.3 port 43294
Feb 23 10:14:42 compute-0 sshd-session[30879]: Connection closed by invalid user ubuntu 157.20.215.3 port 43294 [preauth]
Feb 23 10:14:44 compute-0 sshd-session[30881]: Invalid user ubuntu from 157.20.215.3 port 43296
Feb 23 10:14:45 compute-0 sshd-session[30881]: Connection closed by invalid user ubuntu 157.20.215.3 port 43296 [preauth]
Feb 23 10:14:46 compute-0 sshd-session[30883]: Invalid user ubuntu from 157.20.215.3 port 43312
Feb 23 10:14:47 compute-0 sshd-session[30883]: Connection closed by invalid user ubuntu 157.20.215.3 port 43312 [preauth]
Feb 23 10:14:49 compute-0 sshd-session[30885]: Invalid user ubuntu from 157.20.215.3 port 47816
Feb 23 10:14:49 compute-0 sshd-session[30885]: Connection closed by invalid user ubuntu 157.20.215.3 port 47816 [preauth]
Feb 23 10:14:51 compute-0 sshd-session[30887]: Invalid user ubuntu from 157.20.215.3 port 47822
Feb 23 10:14:51 compute-0 sshd-session[30887]: Connection closed by invalid user ubuntu 157.20.215.3 port 47822 [preauth]
Feb 23 10:14:53 compute-0 sshd-session[30889]: Invalid user ubuntu from 157.20.215.3 port 47836
Feb 23 10:14:53 compute-0 sshd-session[30889]: Connection closed by invalid user ubuntu 157.20.215.3 port 47836 [preauth]
Feb 23 10:14:55 compute-0 sshd-session[30891]: Invalid user ubuntu from 157.20.215.3 port 47850
Feb 23 10:14:55 compute-0 sshd-session[30891]: Connection closed by invalid user ubuntu 157.20.215.3 port 47850 [preauth]
Feb 23 10:14:57 compute-0 sshd-session[30893]: Invalid user ubuntu from 157.20.215.3 port 58422
Feb 23 10:14:58 compute-0 sshd-session[30893]: Connection closed by invalid user ubuntu 157.20.215.3 port 58422 [preauth]
Feb 23 10:14:59 compute-0 sshd-session[30895]: Invalid user ubuntu from 157.20.215.3 port 58438
Feb 23 10:15:00 compute-0 sshd-session[30895]: Connection closed by invalid user ubuntu 157.20.215.3 port 58438 [preauth]
Feb 23 10:15:02 compute-0 sshd-session[30897]: Invalid user ubuntu from 157.20.215.3 port 58450
Feb 23 10:15:02 compute-0 sshd-session[30897]: Connection closed by invalid user ubuntu 157.20.215.3 port 58450 [preauth]
Feb 23 10:15:04 compute-0 sshd-session[30899]: Invalid user ubuntu from 157.20.215.3 port 58458
Feb 23 10:15:04 compute-0 sshd-session[30899]: Connection closed by invalid user ubuntu 157.20.215.3 port 58458 [preauth]
Feb 23 10:15:06 compute-0 sshd-session[30901]: Invalid user ubuntu from 157.20.215.3 port 58462
Feb 23 10:15:06 compute-0 sshd-session[30901]: Connection closed by invalid user ubuntu 157.20.215.3 port 58462 [preauth]
Feb 23 10:15:07 compute-0 sshd-session[30905]: Connection closed by authenticating user root 143.198.30.3 port 56554 [preauth]
Feb 23 10:15:08 compute-0 sshd-session[30903]: Invalid user ubuntu from 157.20.215.3 port 40490
Feb 23 10:15:08 compute-0 sshd-session[30903]: Connection closed by invalid user ubuntu 157.20.215.3 port 40490 [preauth]
Feb 23 10:15:10 compute-0 sshd-session[30907]: Invalid user ubuntu from 157.20.215.3 port 40498
Feb 23 10:15:11 compute-0 sshd-session[30907]: Connection closed by invalid user ubuntu 157.20.215.3 port 40498 [preauth]
Feb 23 10:15:12 compute-0 sshd-session[30909]: Invalid user ubuntu from 157.20.215.3 port 40504
Feb 23 10:15:13 compute-0 sshd-session[30909]: Connection closed by invalid user ubuntu 157.20.215.3 port 40504 [preauth]
Feb 23 10:15:14 compute-0 sshd-session[30911]: Invalid user ubuntu from 157.20.215.3 port 40514
Feb 23 10:15:15 compute-0 sshd-session[30911]: Connection closed by invalid user ubuntu 157.20.215.3 port 40514 [preauth]
Feb 23 10:15:16 compute-0 sshd-session[30913]: Invalid user ubuntu from 157.20.215.3 port 40522
Feb 23 10:15:17 compute-0 sshd-session[30913]: Connection closed by invalid user ubuntu 157.20.215.3 port 40522 [preauth]
Feb 23 10:15:19 compute-0 sshd-session[30915]: Invalid user ubuntu from 157.20.215.3 port 53256
Feb 23 10:15:19 compute-0 sshd-session[30915]: Connection closed by invalid user ubuntu 157.20.215.3 port 53256 [preauth]
Feb 23 10:15:21 compute-0 sshd-session[30917]: Invalid user ubuntu from 157.20.215.3 port 53270
Feb 23 10:15:21 compute-0 sshd-session[30917]: Connection closed by invalid user ubuntu 157.20.215.3 port 53270 [preauth]
Feb 23 10:15:23 compute-0 sshd-session[30919]: Invalid user ubuntu from 157.20.215.3 port 53286
Feb 23 10:15:23 compute-0 sshd-session[30919]: Connection closed by invalid user ubuntu 157.20.215.3 port 53286 [preauth]
Feb 23 10:15:25 compute-0 sshd-session[30921]: Invalid user ubuntu from 157.20.215.3 port 53302
Feb 23 10:15:25 compute-0 sshd-session[30921]: Connection closed by invalid user ubuntu 157.20.215.3 port 53302 [preauth]
Feb 23 10:15:27 compute-0 sshd-session[30923]: Invalid user ubuntu from 157.20.215.3 port 45800
Feb 23 10:15:27 compute-0 sshd-session[30923]: Connection closed by invalid user ubuntu 157.20.215.3 port 45800 [preauth]
Feb 23 10:15:29 compute-0 sshd-session[30925]: Invalid user ubuntu from 157.20.215.3 port 45808
Feb 23 10:15:30 compute-0 sshd-session[30925]: Connection closed by invalid user ubuntu 157.20.215.3 port 45808 [preauth]
Feb 23 10:15:31 compute-0 sshd-session[30927]: Invalid user ubuntu from 157.20.215.3 port 45816
Feb 23 10:15:32 compute-0 sshd-session[30927]: Connection closed by invalid user ubuntu 157.20.215.3 port 45816 [preauth]
Feb 23 10:15:33 compute-0 sshd-session[30929]: Invalid user ubuntu from 157.20.215.3 port 45820
Feb 23 10:15:34 compute-0 sshd-session[30929]: Connection closed by invalid user ubuntu 157.20.215.3 port 45820 [preauth]
Feb 23 10:15:36 compute-0 sshd-session[30931]: Invalid user ubuntu from 157.20.215.3 port 45832
Feb 23 10:15:36 compute-0 sshd-session[30931]: Connection closed by invalid user ubuntu 157.20.215.3 port 45832 [preauth]
Feb 23 10:15:38 compute-0 sshd-session[30933]: Invalid user ubuntu from 157.20.215.3 port 40904
Feb 23 10:15:38 compute-0 sshd-session[30933]: Connection closed by invalid user ubuntu 157.20.215.3 port 40904 [preauth]
Feb 23 10:15:40 compute-0 sshd-session[30935]: Invalid user ubuntu from 157.20.215.3 port 40914
Feb 23 10:15:41 compute-0 sshd-session[30935]: Connection closed by invalid user ubuntu 157.20.215.3 port 40914 [preauth]
Feb 23 10:15:42 compute-0 sshd-session[30937]: Invalid user ubuntu from 157.20.215.3 port 40918
Feb 23 10:15:43 compute-0 sshd-session[30937]: Connection closed by invalid user ubuntu 157.20.215.3 port 40918 [preauth]
Feb 23 10:15:45 compute-0 sshd-session[30939]: Invalid user ubuntu from 157.20.215.3 port 40926
Feb 23 10:15:45 compute-0 sshd-session[30939]: Connection closed by invalid user ubuntu 157.20.215.3 port 40926 [preauth]
Feb 23 10:15:46 compute-0 sshd-session[30941]: Invalid user ubuntu from 157.20.215.3 port 40940
Feb 23 10:15:47 compute-0 sshd-session[30941]: Connection closed by invalid user ubuntu 157.20.215.3 port 40940 [preauth]
Feb 23 10:15:49 compute-0 sshd-session[30943]: Invalid user ubuntu from 157.20.215.3 port 35366
Feb 23 10:15:49 compute-0 sshd-session[30943]: Connection closed by invalid user ubuntu 157.20.215.3 port 35366 [preauth]
Feb 23 10:15:51 compute-0 sshd-session[30945]: Invalid user ubuntu from 157.20.215.3 port 35368
Feb 23 10:15:51 compute-0 sshd-session[30945]: Connection closed by invalid user ubuntu 157.20.215.3 port 35368 [preauth]
Feb 23 10:15:53 compute-0 sshd-session[30947]: Invalid user ubuntu from 157.20.215.3 port 35370
Feb 23 10:15:53 compute-0 sshd-session[30947]: Connection closed by invalid user ubuntu 157.20.215.3 port 35370 [preauth]
Feb 23 10:15:56 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Feb 23 10:15:56 compute-0 sshd-session[30949]: Invalid user ubuntu from 157.20.215.3 port 35376
Feb 23 10:15:56 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 23 10:15:56 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Feb 23 10:15:56 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 23 10:15:56 compute-0 sshd-session[30949]: Connection closed by invalid user ubuntu 157.20.215.3 port 35376 [preauth]
Feb 23 10:15:58 compute-0 sshd-session[30953]: Invalid user ubuntu from 157.20.215.3 port 52262
Feb 23 10:15:58 compute-0 sshd-session[30953]: Connection closed by invalid user ubuntu 157.20.215.3 port 52262 [preauth]
Feb 23 10:16:00 compute-0 sshd-session[30955]: Invalid user ubuntu from 157.20.215.3 port 52274
Feb 23 10:16:00 compute-0 sshd-session[30955]: Connection closed by invalid user ubuntu 157.20.215.3 port 52274 [preauth]
Feb 23 10:16:00 compute-0 sshd-session[30957]: Connection closed by authenticating user root 143.198.30.3 port 46168 [preauth]
Feb 23 10:16:01 compute-0 sshd-session[30961]: Connection closed by 165.227.79.48 port 48834
Feb 23 10:16:02 compute-0 sshd-session[30959]: Invalid user ubuntu from 157.20.215.3 port 52282
Feb 23 10:16:03 compute-0 sshd-session[30959]: Connection closed by invalid user ubuntu 157.20.215.3 port 52282 [preauth]
Feb 23 10:16:04 compute-0 sshd-session[30962]: Invalid user ubuntu from 157.20.215.3 port 52290
Feb 23 10:16:05 compute-0 sshd-session[30962]: Connection closed by invalid user ubuntu 157.20.215.3 port 52290 [preauth]
Feb 23 10:16:07 compute-0 sshd-session[30964]: Invalid user ubuntu from 157.20.215.3 port 52300
Feb 23 10:16:07 compute-0 sshd-session[30964]: Connection closed by invalid user ubuntu 157.20.215.3 port 52300 [preauth]
Feb 23 10:16:09 compute-0 sshd-session[30966]: Invalid user ubuntu from 157.20.215.3 port 56358
Feb 23 10:16:09 compute-0 sshd-session[30966]: Connection closed by invalid user ubuntu 157.20.215.3 port 56358 [preauth]
Feb 23 10:16:11 compute-0 sshd-session[30968]: Invalid user ubuntu from 157.20.215.3 port 56368
Feb 23 10:16:11 compute-0 sshd-session[30968]: Connection closed by invalid user ubuntu 157.20.215.3 port 56368 [preauth]
Feb 23 10:16:13 compute-0 sshd-session[30970]: Invalid user ubuntu from 157.20.215.3 port 56380
Feb 23 10:16:14 compute-0 sshd-session[30970]: Connection closed by invalid user ubuntu 157.20.215.3 port 56380 [preauth]
Feb 23 10:16:16 compute-0 sshd-session[30972]: Invalid user ubuntu from 157.20.215.3 port 56382
Feb 23 10:16:16 compute-0 sshd-session[30972]: Connection closed by invalid user ubuntu 157.20.215.3 port 56382 [preauth]
Feb 23 10:16:18 compute-0 sshd-session[30974]: Invalid user ubuntu from 157.20.215.3 port 59620
Feb 23 10:16:18 compute-0 sshd-session[30974]: Connection closed by invalid user ubuntu 157.20.215.3 port 59620 [preauth]
Feb 23 10:16:20 compute-0 sshd-session[30976]: Invalid user ubuntu from 157.20.215.3 port 59634
Feb 23 10:16:20 compute-0 sshd-session[30976]: Connection closed by invalid user ubuntu 157.20.215.3 port 59634 [preauth]
Feb 23 10:16:22 compute-0 sshd-session[30978]: Invalid user ubuntu from 157.20.215.3 port 59636
Feb 23 10:16:22 compute-0 sshd-session[30978]: Connection closed by invalid user ubuntu 157.20.215.3 port 59636 [preauth]
Feb 23 10:16:24 compute-0 sshd-session[30980]: Invalid user ubuntu from 157.20.215.3 port 59652
Feb 23 10:16:25 compute-0 sshd-session[30980]: Connection closed by invalid user ubuntu 157.20.215.3 port 59652 [preauth]
Feb 23 10:16:27 compute-0 sshd-session[30982]: Invalid user ubuntu from 157.20.215.3 port 59666
Feb 23 10:16:27 compute-0 sshd-session[30982]: Connection closed by invalid user ubuntu 157.20.215.3 port 59666 [preauth]
Feb 23 10:16:29 compute-0 sshd-session[30984]: Invalid user ubuntu from 157.20.215.3 port 54380
Feb 23 10:16:29 compute-0 sshd-session[30986]: Accepted publickey for zuul from 38.102.83.129 port 35548 ssh2: RSA SHA256:BiJLa2SOE3eAuOQ4B+aHpprkSCedZrO5BRA4A2P+Trc
Feb 23 10:16:29 compute-0 systemd-logind[808]: New session 7 of user zuul.
Feb 23 10:16:29 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 23 10:16:29 compute-0 sshd-session[30986]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:16:29 compute-0 sshd-session[30984]: Connection closed by invalid user ubuntu 157.20.215.3 port 54380 [preauth]
Feb 23 10:16:29 compute-0 python3[31062]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:16:31 compute-0 sudo[31178]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjdwslmrtnrjhblmkusrnuvwlgauwwq ; /usr/bin/python3'
Feb 23 10:16:31 compute-0 sudo[31178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:31 compute-0 python3[31180]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:31 compute-0 sudo[31178]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:31 compute-0 sshd-session[31085]: Invalid user ubuntu from 157.20.215.3 port 54384
Feb 23 10:16:31 compute-0 sudo[31251]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfbybusdsehuijkvvcyxbkpgntulwvsf ; /usr/bin/python3'
Feb 23 10:16:31 compute-0 sudo[31251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:31 compute-0 python3[31253]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=delorean.repo follow=False checksum=c7624fe5e858d4139de1ac159778eb6fd097c2ca backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:31 compute-0 sudo[31251]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:31 compute-0 sudo[31277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxsavefqmfkhmhvvtouebesrqciuiiag ; /usr/bin/python3'
Feb 23 10:16:31 compute-0 sudo[31277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:31 compute-0 python3[31279]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:31 compute-0 sudo[31277]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:31 compute-0 sshd-session[31085]: Connection closed by invalid user ubuntu 157.20.215.3 port 54384 [preauth]
Feb 23 10:16:32 compute-0 sudo[31350]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bievcyaeadegfuhcdlqggcbqhuakzdhc ; /usr/bin/python3'
Feb 23 10:16:32 compute-0 sudo[31350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:32 compute-0 python3[31352]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:32 compute-0 sudo[31350]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:32 compute-0 sudo[31376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuwiiajuejwaaiaicxgtguqynvsjryco ; /usr/bin/python3'
Feb 23 10:16:32 compute-0 sudo[31376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:32 compute-0 python3[31378]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:32 compute-0 sudo[31376]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:32 compute-0 sudo[31451]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvynrjpotyznywfmiuinppazcoycipqv ; /usr/bin/python3'
Feb 23 10:16:32 compute-0 sudo[31451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:32 compute-0 python3[31453]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:32 compute-0 sudo[31451]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:32 compute-0 sudo[31477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzymmpuqohjkwgyrmvsatwdghjitgirz ; /usr/bin/python3'
Feb 23 10:16:32 compute-0 sudo[31477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:32 compute-0 python3[31479]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:32 compute-0 sudo[31477]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:33 compute-0 sudo[31550]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izjgwpyyropsrdmxfbwmidhsqosnmdmw ; /usr/bin/python3'
Feb 23 10:16:33 compute-0 sudo[31550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:33 compute-0 python3[31552]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:33 compute-0 sudo[31550]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:33 compute-0 sudo[31576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylxluwliojtrylwchleconjuoqwrkxcz ; /usr/bin/python3'
Feb 23 10:16:33 compute-0 sudo[31576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:33 compute-0 python3[31578]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:33 compute-0 sudo[31576]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:33 compute-0 sudo[31649]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyantopisxobxbvmrxuyklvpwjffmwfk ; /usr/bin/python3'
Feb 23 10:16:33 compute-0 sudo[31649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:33 compute-0 python3[31651]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:33 compute-0 sudo[31649]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:33 compute-0 sudo[31675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwplzmfokidgoxaauzfailtloiwjazhr ; /usr/bin/python3'
Feb 23 10:16:33 compute-0 sudo[31675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:33 compute-0 sshd-session[31379]: Invalid user ubuntu from 157.20.215.3 port 54390
Feb 23 10:16:33 compute-0 python3[31677]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:33 compute-0 sudo[31675]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:33 compute-0 sudo[31748]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isxlprletadtralmbodunyigkbipomug ; /usr/bin/python3'
Feb 23 10:16:33 compute-0 sudo[31748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:34 compute-0 python3[31750]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:34 compute-0 sudo[31748]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:34 compute-0 sudo[31774]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-froklmycsvcgtttyqfpmwmenlnfzxyem ; /usr/bin/python3'
Feb 23 10:16:34 compute-0 sudo[31774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:34 compute-0 sshd-session[31379]: Connection closed by invalid user ubuntu 157.20.215.3 port 54390 [preauth]
Feb 23 10:16:34 compute-0 python3[31776]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 10:16:34 compute-0 sudo[31774]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:34 compute-0 sudo[31847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drtweuktdwwzedrfyujnzgsyvynmytsr ; /usr/bin/python3'
Feb 23 10:16:34 compute-0 sudo[31847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:16:34 compute-0 python3[31849]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771841791.10098-34400-73515196300380/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=06a0a916cb7cbc51b08d6616a672f1322305cccf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:16:34 compute-0 sudo[31847]: pam_unix(sudo:session): session closed for user root
Feb 23 10:16:36 compute-0 sshd-session[31850]: Invalid user ubuntu from 157.20.215.3 port 54402
Feb 23 10:16:36 compute-0 sshd-session[31850]: Connection closed by invalid user ubuntu 157.20.215.3 port 54402 [preauth]
Feb 23 10:16:37 compute-0 sshd-session[31878]: Connection closed by 192.168.122.11 port 35770 [preauth]
Feb 23 10:16:37 compute-0 sshd-session[31879]: Connection closed by 192.168.122.11 port 35786 [preauth]
Feb 23 10:16:37 compute-0 sshd-session[31880]: Unable to negotiate with 192.168.122.11 port 35794: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 23 10:16:37 compute-0 sshd-session[31881]: Unable to negotiate with 192.168.122.11 port 35806: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 23 10:16:37 compute-0 sshd-session[31882]: Unable to negotiate with 192.168.122.11 port 35818: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 23 10:16:38 compute-0 sshd-session[31876]: Invalid user ubuntu from 157.20.215.3 port 51224
Feb 23 10:16:38 compute-0 sshd-session[31876]: Connection closed by invalid user ubuntu 157.20.215.3 port 51224 [preauth]
Feb 23 10:16:40 compute-0 sshd-session[31888]: Invalid user ubuntu from 157.20.215.3 port 51236
Feb 23 10:16:41 compute-0 sshd-session[31888]: Connection closed by invalid user ubuntu 157.20.215.3 port 51236 [preauth]
Feb 23 10:16:42 compute-0 sshd-session[31890]: Invalid user telecomadmin from 185.156.73.233 port 36774
Feb 23 10:16:42 compute-0 sshd-session[31890]: Connection closed by invalid user telecomadmin 185.156.73.233 port 36774 [preauth]
Feb 23 10:16:43 compute-0 sshd-session[31892]: Invalid user ubuntu from 157.20.215.3 port 51252
Feb 23 10:16:43 compute-0 sshd-session[31892]: Connection closed by invalid user ubuntu 157.20.215.3 port 51252 [preauth]
Feb 23 10:16:45 compute-0 sshd-session[31894]: Invalid user ubuntu from 157.20.215.3 port 51268
Feb 23 10:16:45 compute-0 sshd-session[31894]: Connection closed by invalid user ubuntu 157.20.215.3 port 51268 [preauth]
Feb 23 10:16:47 compute-0 sshd-session[31896]: Invalid user ubuntu from 157.20.215.3 port 43918
Feb 23 10:16:47 compute-0 sshd-session[31896]: Connection closed by invalid user ubuntu 157.20.215.3 port 43918 [preauth]
Feb 23 10:16:49 compute-0 sshd-session[31898]: Invalid user ubuntu from 157.20.215.3 port 43934
Feb 23 10:16:50 compute-0 sshd-session[31898]: Connection closed by invalid user ubuntu 157.20.215.3 port 43934 [preauth]
Feb 23 10:16:52 compute-0 sshd-session[31900]: Invalid user ubuntu from 157.20.215.3 port 43946
Feb 23 10:16:52 compute-0 sshd-session[31900]: Connection closed by invalid user ubuntu 157.20.215.3 port 43946 [preauth]
Feb 23 10:16:53 compute-0 sshd-session[31902]: Connection closed by authenticating user root 143.198.30.3 port 58718 [preauth]
Feb 23 10:16:55 compute-0 sshd-session[31904]: Invalid user ubuntu from 157.20.215.3 port 43948
Feb 23 10:16:55 compute-0 sshd-session[31904]: Connection closed by invalid user ubuntu 157.20.215.3 port 43948 [preauth]
Feb 23 10:16:57 compute-0 sshd-session[31906]: Invalid user ubuntu from 157.20.215.3 port 40372
Feb 23 10:16:58 compute-0 sshd-session[31906]: Connection closed by invalid user ubuntu 157.20.215.3 port 40372 [preauth]
Feb 23 10:16:59 compute-0 sshd-session[31908]: Invalid user ubuntu from 157.20.215.3 port 40386
Feb 23 10:17:00 compute-0 sshd-session[31908]: Connection closed by invalid user ubuntu 157.20.215.3 port 40386 [preauth]
Feb 23 10:17:02 compute-0 sshd-session[31910]: Invalid user ubuntu from 157.20.215.3 port 40394
Feb 23 10:17:02 compute-0 sshd-session[31910]: Connection closed by invalid user ubuntu 157.20.215.3 port 40394 [preauth]
Feb 23 10:17:04 compute-0 sshd-session[31912]: Invalid user ubuntu from 157.20.215.3 port 40398
Feb 23 10:17:04 compute-0 sshd-session[31912]: Connection closed by invalid user ubuntu 157.20.215.3 port 40398 [preauth]
Feb 23 10:17:06 compute-0 sshd-session[31914]: Invalid user ubuntu from 157.20.215.3 port 40408
Feb 23 10:17:06 compute-0 sshd-session[31914]: Connection closed by invalid user ubuntu 157.20.215.3 port 40408 [preauth]
Feb 23 10:17:09 compute-0 sshd-session[31916]: Invalid user ubuntu from 157.20.215.3 port 36776
Feb 23 10:17:10 compute-0 sshd-session[31916]: Connection closed by invalid user ubuntu 157.20.215.3 port 36776 [preauth]
Feb 23 10:17:12 compute-0 sshd-session[31918]: Invalid user ubuntu from 157.20.215.3 port 36778
Feb 23 10:17:12 compute-0 sshd-session[31918]: Connection closed by invalid user ubuntu 157.20.215.3 port 36778 [preauth]
Feb 23 10:17:14 compute-0 sshd-session[31920]: Invalid user ubuntu from 157.20.215.3 port 36788
Feb 23 10:17:14 compute-0 sshd-session[31920]: Connection closed by invalid user ubuntu 157.20.215.3 port 36788 [preauth]
Feb 23 10:17:16 compute-0 sshd-session[31922]: Invalid user ubuntu from 157.20.215.3 port 36802
Feb 23 10:17:16 compute-0 sshd-session[31922]: Connection closed by invalid user ubuntu 157.20.215.3 port 36802 [preauth]
Feb 23 10:17:19 compute-0 sshd-session[31924]: Invalid user debian from 157.20.215.3 port 57402
Feb 23 10:17:19 compute-0 sshd-session[31924]: Connection closed by invalid user debian 157.20.215.3 port 57402 [preauth]
Feb 23 10:17:21 compute-0 sshd-session[31926]: Invalid user debian from 157.20.215.3 port 57412
Feb 23 10:17:21 compute-0 sshd-session[31926]: Connection closed by invalid user debian 157.20.215.3 port 57412 [preauth]
Feb 23 10:17:23 compute-0 sshd-session[31928]: Invalid user debian from 157.20.215.3 port 57420
Feb 23 10:17:23 compute-0 sshd-session[31928]: Connection closed by invalid user debian 157.20.215.3 port 57420 [preauth]
Feb 23 10:17:25 compute-0 sshd-session[31930]: Invalid user debian from 157.20.215.3 port 57430
Feb 23 10:17:26 compute-0 sshd-session[31930]: Connection closed by invalid user debian 157.20.215.3 port 57430 [preauth]
Feb 23 10:17:28 compute-0 sshd-session[31933]: Invalid user debian from 157.20.215.3 port 48536
Feb 23 10:17:29 compute-0 sshd-session[31933]: Connection closed by invalid user debian 157.20.215.3 port 48536 [preauth]
Feb 23 10:17:30 compute-0 sshd-session[31935]: Invalid user debian from 157.20.215.3 port 48550
Feb 23 10:17:30 compute-0 sshd-session[31935]: Connection closed by invalid user debian 157.20.215.3 port 48550 [preauth]
Feb 23 10:17:32 compute-0 sshd-session[31937]: Invalid user debian from 157.20.215.3 port 48560
Feb 23 10:17:32 compute-0 sshd-session[31937]: Connection closed by invalid user debian 157.20.215.3 port 48560 [preauth]
Feb 23 10:17:34 compute-0 sshd-session[31939]: Invalid user debian from 157.20.215.3 port 48564
Feb 23 10:17:35 compute-0 sshd-session[31939]: Connection closed by invalid user debian 157.20.215.3 port 48564 [preauth]
Feb 23 10:17:36 compute-0 sshd-session[31941]: Invalid user debian from 157.20.215.3 port 48570
Feb 23 10:17:37 compute-0 sshd-session[31941]: Connection closed by invalid user debian 157.20.215.3 port 48570 [preauth]
Feb 23 10:17:38 compute-0 python3[31966]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:17:40 compute-0 sshd-session[31968]: Invalid user debian from 157.20.215.3 port 38612
Feb 23 10:17:40 compute-0 sshd-session[31968]: Connection closed by invalid user debian 157.20.215.3 port 38612 [preauth]
Feb 23 10:17:42 compute-0 sshd-session[31970]: Invalid user debian from 157.20.215.3 port 38628
Feb 23 10:17:42 compute-0 sshd-session[31970]: Connection closed by invalid user debian 157.20.215.3 port 38628 [preauth]
Feb 23 10:17:44 compute-0 sshd-session[31972]: Connection closed by authenticating user root 143.198.30.3 port 35750 [preauth]
Feb 23 10:17:45 compute-0 sshd-session[31974]: Invalid user debian from 157.20.215.3 port 38632
Feb 23 10:17:46 compute-0 sshd-session[31974]: Connection closed by invalid user debian 157.20.215.3 port 38632 [preauth]
Feb 23 10:17:48 compute-0 sshd-session[31976]: Invalid user debian from 157.20.215.3 port 43096
Feb 23 10:17:48 compute-0 sshd-session[31976]: Connection closed by invalid user debian 157.20.215.3 port 43096 [preauth]
Feb 23 10:17:50 compute-0 sshd-session[31978]: Invalid user debian from 157.20.215.3 port 43110
Feb 23 10:17:50 compute-0 sshd-session[31978]: Connection closed by invalid user debian 157.20.215.3 port 43110 [preauth]
Feb 23 10:17:52 compute-0 sshd-session[31980]: Invalid user debian from 157.20.215.3 port 43116
Feb 23 10:17:52 compute-0 sshd-session[31980]: Connection closed by invalid user debian 157.20.215.3 port 43116 [preauth]
Feb 23 10:17:54 compute-0 sshd-session[31982]: Invalid user debian from 157.20.215.3 port 43126
Feb 23 10:17:55 compute-0 sshd-session[31982]: Connection closed by invalid user debian 157.20.215.3 port 43126 [preauth]
Feb 23 10:17:57 compute-0 sshd-session[31985]: Invalid user debian from 157.20.215.3 port 43134
Feb 23 10:17:57 compute-0 sshd-session[31985]: Connection closed by invalid user debian 157.20.215.3 port 43134 [preauth]
Feb 23 10:18:00 compute-0 sshd-session[31987]: Invalid user debian from 157.20.215.3 port 54816
Feb 23 10:18:00 compute-0 sshd-session[31987]: Connection closed by invalid user debian 157.20.215.3 port 54816 [preauth]
Feb 23 10:18:02 compute-0 sshd-session[31989]: Invalid user debian from 157.20.215.3 port 54820
Feb 23 10:18:03 compute-0 sshd-session[31989]: Connection closed by invalid user debian 157.20.215.3 port 54820 [preauth]
Feb 23 10:18:05 compute-0 sshd-session[31991]: Invalid user debian from 157.20.215.3 port 54826
Feb 23 10:18:05 compute-0 sshd-session[31991]: Connection closed by invalid user debian 157.20.215.3 port 54826 [preauth]
Feb 23 10:18:07 compute-0 sshd-session[31993]: Invalid user debian from 157.20.215.3 port 46138
Feb 23 10:18:07 compute-0 sshd-session[31993]: Connection closed by invalid user debian 157.20.215.3 port 46138 [preauth]
Feb 23 10:18:09 compute-0 sshd-session[31996]: Invalid user debian from 157.20.215.3 port 46150
Feb 23 10:18:10 compute-0 sshd-session[31996]: Connection closed by invalid user debian 157.20.215.3 port 46150 [preauth]
Feb 23 10:18:12 compute-0 sshd-session[31998]: Invalid user debian from 157.20.215.3 port 46154
Feb 23 10:18:13 compute-0 sshd-session[31998]: Connection closed by invalid user debian 157.20.215.3 port 46154 [preauth]
Feb 23 10:18:14 compute-0 sshd-session[32000]: Invalid user debian from 157.20.215.3 port 46160
Feb 23 10:18:15 compute-0 sshd-session[32000]: Connection closed by invalid user debian 157.20.215.3 port 46160 [preauth]
Feb 23 10:18:17 compute-0 sshd-session[32002]: Invalid user debian from 157.20.215.3 port 46168
Feb 23 10:18:17 compute-0 sshd-session[32002]: Connection closed by invalid user debian 157.20.215.3 port 46168 [preauth]
Feb 23 10:18:19 compute-0 sshd-session[32004]: Invalid user debian from 157.20.215.3 port 43650
Feb 23 10:18:19 compute-0 sshd-session[32004]: Connection closed by invalid user debian 157.20.215.3 port 43650 [preauth]
Feb 23 10:18:21 compute-0 sshd-session[32006]: Invalid user debian from 157.20.215.3 port 43662
Feb 23 10:18:22 compute-0 sshd-session[32006]: Connection closed by invalid user debian 157.20.215.3 port 43662 [preauth]
Feb 23 10:18:24 compute-0 sshd-session[32008]: Invalid user debian from 157.20.215.3 port 43670
Feb 23 10:18:24 compute-0 sshd-session[32008]: Connection closed by invalid user debian 157.20.215.3 port 43670 [preauth]
Feb 23 10:18:26 compute-0 sshd-session[32010]: Invalid user debian from 157.20.215.3 port 43674
Feb 23 10:18:26 compute-0 sshd-session[32010]: Connection closed by invalid user debian 157.20.215.3 port 43674 [preauth]
Feb 23 10:18:28 compute-0 sshd-session[32012]: Invalid user debian from 157.20.215.3 port 33954
Feb 23 10:18:29 compute-0 sshd-session[32012]: Connection closed by invalid user debian 157.20.215.3 port 33954 [preauth]
Feb 23 10:18:30 compute-0 sshd-session[32014]: Invalid user debian from 157.20.215.3 port 33966
Feb 23 10:18:31 compute-0 sshd-session[32014]: Connection closed by invalid user debian 157.20.215.3 port 33966 [preauth]
Feb 23 10:18:33 compute-0 sshd-session[32016]: Invalid user debian from 157.20.215.3 port 33972
Feb 23 10:18:33 compute-0 sshd-session[32018]: Connection closed by authenticating user root 143.198.30.3 port 57336 [preauth]
Feb 23 10:18:33 compute-0 sshd-session[32016]: Connection closed by invalid user debian 157.20.215.3 port 33972 [preauth]
Feb 23 10:18:36 compute-0 sshd-session[32020]: Invalid user debian from 157.20.215.3 port 33980
Feb 23 10:18:36 compute-0 sshd-session[32020]: Connection closed by invalid user debian 157.20.215.3 port 33980 [preauth]
Feb 23 10:18:39 compute-0 sshd-session[32023]: Invalid user debian from 157.20.215.3 port 48774
Feb 23 10:18:39 compute-0 sshd-session[32023]: Connection closed by invalid user debian 157.20.215.3 port 48774 [preauth]
Feb 23 10:18:43 compute-0 sshd-session[32025]: Invalid user debian from 157.20.215.3 port 48776
Feb 23 10:18:43 compute-0 sshd-session[32025]: Connection closed by invalid user debian 157.20.215.3 port 48776 [preauth]
Feb 23 10:18:46 compute-0 sshd-session[32027]: Invalid user debian from 157.20.215.3 port 48788
Feb 23 10:18:46 compute-0 sshd-session[32027]: Connection closed by invalid user debian 157.20.215.3 port 48788 [preauth]
Feb 23 10:18:48 compute-0 sshd-session[32029]: Invalid user debian from 157.20.215.3 port 40378
Feb 23 10:18:48 compute-0 sshd-session[32029]: Connection closed by invalid user debian 157.20.215.3 port 40378 [preauth]
Feb 23 10:18:50 compute-0 sshd-session[32031]: Invalid user debian from 157.20.215.3 port 40394
Feb 23 10:18:51 compute-0 sshd-session[32031]: Connection closed by invalid user debian 157.20.215.3 port 40394 [preauth]
Feb 23 10:18:53 compute-0 sshd-session[32033]: Invalid user debian from 157.20.215.3 port 40398
Feb 23 10:18:53 compute-0 sshd-session[32033]: Connection closed by invalid user debian 157.20.215.3 port 40398 [preauth]
Feb 23 10:18:55 compute-0 sshd-session[32035]: Invalid user debian from 157.20.215.3 port 40404
Feb 23 10:18:55 compute-0 sshd-session[32035]: Connection closed by invalid user debian 157.20.215.3 port 40404 [preauth]
Feb 23 10:18:57 compute-0 sshd-session[32037]: Invalid user debian from 157.20.215.3 port 44534
Feb 23 10:18:58 compute-0 sshd-session[32037]: Connection closed by invalid user debian 157.20.215.3 port 44534 [preauth]
Feb 23 10:18:59 compute-0 sshd-session[32039]: Invalid user debian from 157.20.215.3 port 44544
Feb 23 10:19:00 compute-0 sshd-session[32039]: Connection closed by invalid user debian 157.20.215.3 port 44544 [preauth]
Feb 23 10:19:02 compute-0 sshd-session[32041]: Invalid user debian from 157.20.215.3 port 44554
Feb 23 10:19:02 compute-0 sshd-session[32041]: Connection closed by invalid user debian 157.20.215.3 port 44554 [preauth]
Feb 23 10:19:04 compute-0 sshd-session[32043]: Invalid user debian from 157.20.215.3 port 44558
Feb 23 10:19:04 compute-0 sshd-session[32043]: Connection closed by invalid user debian 157.20.215.3 port 44558 [preauth]
Feb 23 10:19:07 compute-0 sshd-session[32045]: Invalid user debian from 157.20.215.3 port 44574
Feb 23 10:19:07 compute-0 sshd-session[32045]: Connection closed by invalid user debian 157.20.215.3 port 44574 [preauth]
Feb 23 10:19:09 compute-0 sshd-session[32047]: Invalid user debian from 157.20.215.3 port 53096
Feb 23 10:19:09 compute-0 sshd-session[32047]: Connection closed by invalid user debian 157.20.215.3 port 53096 [preauth]
Feb 23 10:19:11 compute-0 sshd-session[32049]: Invalid user debian from 157.20.215.3 port 53104
Feb 23 10:19:11 compute-0 sshd-session[32049]: Connection closed by invalid user debian 157.20.215.3 port 53104 [preauth]
Feb 23 10:19:13 compute-0 sshd-session[32051]: Invalid user debian from 157.20.215.3 port 53120
Feb 23 10:19:14 compute-0 sshd-session[32051]: Connection closed by invalid user debian 157.20.215.3 port 53120 [preauth]
Feb 23 10:19:17 compute-0 sshd-session[32053]: Invalid user debian from 157.20.215.3 port 53126
Feb 23 10:19:18 compute-0 sshd-session[32053]: Connection closed by invalid user debian 157.20.215.3 port 53126 [preauth]
Feb 23 10:19:19 compute-0 sshd-session[32055]: Invalid user debian from 157.20.215.3 port 33346
Feb 23 10:19:19 compute-0 sshd-session[32055]: Connection closed by invalid user debian 157.20.215.3 port 33346 [preauth]
Feb 23 10:19:21 compute-0 sshd-session[32057]: Invalid user debian from 157.20.215.3 port 33354
Feb 23 10:19:22 compute-0 sshd-session[32057]: Connection closed by invalid user debian 157.20.215.3 port 33354 [preauth]
Feb 23 10:19:23 compute-0 sshd-session[32059]: Invalid user debian from 157.20.215.3 port 33370
Feb 23 10:19:24 compute-0 sshd-session[32059]: Connection closed by invalid user debian 157.20.215.3 port 33370 [preauth]
Feb 23 10:19:26 compute-0 sshd-session[32061]: Invalid user debian from 157.20.215.3 port 33384
Feb 23 10:19:26 compute-0 sshd-session[32061]: Connection closed by invalid user debian 157.20.215.3 port 33384 [preauth]
Feb 23 10:19:28 compute-0 sshd-session[32065]: error: kex_exchange_identification: read: Connection reset by peer
Feb 23 10:19:28 compute-0 sshd-session[32065]: Connection reset by 143.198.30.3 port 34016
Feb 23 10:19:28 compute-0 sshd-session[32063]: Invalid user debian from 157.20.215.3 port 45616
Feb 23 10:19:28 compute-0 sshd-session[32063]: Connection closed by invalid user debian 157.20.215.3 port 45616 [preauth]
Feb 23 10:19:29 compute-0 sshd-session[32066]: Connection closed by authenticating user root 143.198.30.3 port 34020 [preauth]
Feb 23 10:19:30 compute-0 sshd-session[32068]: Invalid user debian from 157.20.215.3 port 45628
Feb 23 10:19:30 compute-0 sshd-session[32068]: Connection closed by invalid user debian 157.20.215.3 port 45628 [preauth]
Feb 23 10:19:32 compute-0 sshd-session[32070]: Invalid user debian from 157.20.215.3 port 45636
Feb 23 10:19:33 compute-0 sshd-session[32070]: Connection closed by invalid user debian 157.20.215.3 port 45636 [preauth]
Feb 23 10:19:35 compute-0 sshd-session[32072]: Invalid user debian from 157.20.215.3 port 45648
Feb 23 10:19:35 compute-0 sshd-session[32072]: Connection closed by invalid user debian 157.20.215.3 port 45648 [preauth]
Feb 23 10:19:37 compute-0 sshd-session[32074]: Invalid user debian from 157.20.215.3 port 45654
Feb 23 10:19:37 compute-0 sshd-session[32074]: Connection closed by invalid user debian 157.20.215.3 port 45654 [preauth]
Feb 23 10:19:39 compute-0 sshd-session[32076]: Invalid user debian from 157.20.215.3 port 36962
Feb 23 10:19:39 compute-0 sshd-session[32076]: Connection closed by invalid user debian 157.20.215.3 port 36962 [preauth]
Feb 23 10:19:41 compute-0 sshd-session[32078]: Invalid user debian from 157.20.215.3 port 36974
Feb 23 10:19:42 compute-0 sshd-session[32078]: Connection closed by invalid user debian 157.20.215.3 port 36974 [preauth]
Feb 23 10:19:44 compute-0 sshd-session[32080]: Invalid user debian from 157.20.215.3 port 36976
Feb 23 10:19:44 compute-0 sshd-session[32080]: Connection closed by invalid user debian 157.20.215.3 port 36976 [preauth]
Feb 23 10:19:46 compute-0 sshd-session[32082]: Invalid user debian from 157.20.215.3 port 36984
Feb 23 10:19:46 compute-0 sshd-session[32082]: Connection closed by invalid user debian 157.20.215.3 port 36984 [preauth]
Feb 23 10:19:48 compute-0 sshd-session[32084]: Invalid user debian from 157.20.215.3 port 35066
Feb 23 10:19:48 compute-0 sshd-session[32084]: Connection closed by invalid user debian 157.20.215.3 port 35066 [preauth]
Feb 23 10:19:50 compute-0 sshd-session[32086]: Invalid user debian from 157.20.215.3 port 35076
Feb 23 10:19:51 compute-0 sshd-session[32086]: Connection closed by invalid user debian 157.20.215.3 port 35076 [preauth]
Feb 23 10:19:53 compute-0 sshd-session[32088]: Invalid user debian from 157.20.215.3 port 35078
Feb 23 10:19:53 compute-0 sshd-session[32088]: Connection closed by invalid user debian 157.20.215.3 port 35078 [preauth]
Feb 23 10:19:55 compute-0 sshd-session[32090]: Invalid user debian from 157.20.215.3 port 35084
Feb 23 10:19:55 compute-0 sshd-session[32090]: Connection closed by invalid user debian 157.20.215.3 port 35084 [preauth]
Feb 23 10:19:57 compute-0 sshd-session[32092]: Invalid user debian from 157.20.215.3 port 40380
Feb 23 10:19:58 compute-0 sshd-session[32092]: Connection closed by invalid user debian 157.20.215.3 port 40380 [preauth]
Feb 23 10:20:01 compute-0 sshd-session[32094]: Invalid user debian from 157.20.215.3 port 40388
Feb 23 10:20:01 compute-0 sshd-session[32094]: Connection closed by invalid user debian 157.20.215.3 port 40388 [preauth]
Feb 23 10:20:03 compute-0 sshd-session[32096]: Invalid user debian from 157.20.215.3 port 40392
Feb 23 10:20:03 compute-0 sshd-session[32096]: Connection closed by invalid user debian 157.20.215.3 port 40392 [preauth]
Feb 23 10:20:05 compute-0 sshd-session[32098]: Invalid user debian from 157.20.215.3 port 40404
Feb 23 10:20:06 compute-0 sshd-session[32098]: Connection closed by invalid user debian 157.20.215.3 port 40404 [preauth]
Feb 23 10:20:07 compute-0 sshd-session[32100]: Invalid user debian from 157.20.215.3 port 32770
Feb 23 10:20:08 compute-0 sshd-session[32100]: Connection closed by invalid user debian 157.20.215.3 port 32770 [preauth]
Feb 23 10:20:10 compute-0 sshd-session[32102]: Invalid user debian from 157.20.215.3 port 32786
Feb 23 10:20:10 compute-0 sshd-session[32102]: Connection closed by invalid user debian 157.20.215.3 port 32786 [preauth]
Feb 23 10:20:12 compute-0 sshd-session[32104]: Invalid user debian from 157.20.215.3 port 32796
Feb 23 10:20:12 compute-0 sshd-session[32104]: Connection closed by invalid user debian 157.20.215.3 port 32796 [preauth]
Feb 23 10:20:14 compute-0 sshd-session[32106]: Invalid user debian from 157.20.215.3 port 32800
Feb 23 10:20:15 compute-0 sshd-session[32106]: Connection closed by invalid user debian 157.20.215.3 port 32800 [preauth]
Feb 23 10:20:16 compute-0 sshd-session[32108]: Invalid user debian from 157.20.215.3 port 32812
Feb 23 10:20:17 compute-0 sshd-session[32108]: Connection closed by invalid user debian 157.20.215.3 port 32812 [preauth]
Feb 23 10:20:19 compute-0 sshd-session[32110]: Invalid user debian from 157.20.215.3 port 36490
Feb 23 10:20:19 compute-0 sshd-session[32110]: Connection closed by invalid user debian 157.20.215.3 port 36490 [preauth]
Feb 23 10:20:22 compute-0 sshd-session[32112]: Invalid user debian from 157.20.215.3 port 36496
Feb 23 10:20:23 compute-0 sshd-session[32112]: Connection closed by invalid user debian 157.20.215.3 port 36496 [preauth]
Feb 23 10:20:24 compute-0 sshd-session[32114]: Invalid user debian from 157.20.215.3 port 36498
Feb 23 10:20:25 compute-0 sshd-session[32114]: Connection closed by invalid user debian 157.20.215.3 port 36498 [preauth]
Feb 23 10:20:28 compute-0 sshd-session[32116]: Invalid user debian from 157.20.215.3 port 36500
Feb 23 10:20:28 compute-0 sshd-session[32116]: Connection closed by invalid user debian 157.20.215.3 port 36500 [preauth]
Feb 23 10:20:30 compute-0 sshd-session[32118]: Invalid user debian from 157.20.215.3 port 32954
Feb 23 10:20:30 compute-0 sshd-session[32118]: Connection closed by invalid user debian 157.20.215.3 port 32954 [preauth]
Feb 23 10:20:32 compute-0 sshd-session[32120]: Invalid user debian from 157.20.215.3 port 32968
Feb 23 10:20:33 compute-0 sshd-session[32120]: Connection closed by invalid user debian 157.20.215.3 port 32968 [preauth]
Feb 23 10:20:34 compute-0 sshd-session[32124]: error: kex_exchange_identification: read: Connection reset by peer
Feb 23 10:20:34 compute-0 sshd-session[32124]: Connection reset by 143.198.30.3 port 56978
Feb 23 10:20:34 compute-0 sshd-session[32125]: Connection closed by authenticating user root 143.198.30.3 port 56984 [preauth]
Feb 23 10:20:35 compute-0 sshd-session[32122]: Invalid user debian from 157.20.215.3 port 32982
Feb 23 10:20:35 compute-0 sshd-session[32122]: Connection closed by invalid user debian 157.20.215.3 port 32982 [preauth]
Feb 23 10:20:37 compute-0 sshd-session[32127]: Invalid user admin from 157.20.215.3 port 50430
Feb 23 10:20:37 compute-0 sshd-session[32127]: Connection closed by invalid user admin 157.20.215.3 port 50430 [preauth]
Feb 23 10:20:39 compute-0 sshd-session[32129]: Invalid user admin from 157.20.215.3 port 50438
Feb 23 10:20:40 compute-0 sshd-session[32129]: Connection closed by invalid user admin 157.20.215.3 port 50438 [preauth]
Feb 23 10:20:42 compute-0 sshd-session[32131]: Invalid user admin from 157.20.215.3 port 50452
Feb 23 10:20:42 compute-0 sshd-session[32131]: Connection closed by invalid user admin 157.20.215.3 port 50452 [preauth]
Feb 23 10:20:44 compute-0 sshd-session[32133]: Invalid user admin from 157.20.215.3 port 50468
Feb 23 10:20:45 compute-0 sshd-session[32133]: Connection closed by invalid user admin 157.20.215.3 port 50468 [preauth]
Feb 23 10:20:47 compute-0 sshd-session[32135]: Invalid user admin from 157.20.215.3 port 50480
Feb 23 10:20:47 compute-0 sshd-session[32135]: Connection closed by invalid user admin 157.20.215.3 port 50480 [preauth]
Feb 23 10:20:49 compute-0 sshd-session[32137]: Invalid user admin from 157.20.215.3 port 39094
Feb 23 10:20:49 compute-0 sshd-session[32137]: Connection closed by invalid user admin 157.20.215.3 port 39094 [preauth]
Feb 23 10:20:51 compute-0 sshd-session[32139]: Invalid user admin from 157.20.215.3 port 39102
Feb 23 10:20:52 compute-0 sshd-session[32139]: Connection closed by invalid user admin 157.20.215.3 port 39102 [preauth]
Feb 23 10:20:54 compute-0 sshd-session[32141]: Invalid user admin from 157.20.215.3 port 39104
Feb 23 10:20:55 compute-0 sshd-session[32141]: Connection closed by invalid user admin 157.20.215.3 port 39104 [preauth]
Feb 23 10:20:57 compute-0 sshd-session[32143]: Invalid user admin from 157.20.215.3 port 39120
Feb 23 10:20:57 compute-0 sshd-session[32143]: Connection closed by invalid user admin 157.20.215.3 port 39120 [preauth]
Feb 23 10:20:59 compute-0 sshd-session[32145]: Invalid user admin from 157.20.215.3 port 47932
Feb 23 10:20:59 compute-0 sshd-session[32145]: Connection closed by invalid user admin 157.20.215.3 port 47932 [preauth]
Feb 23 10:21:01 compute-0 sshd-session[32147]: Invalid user admin from 157.20.215.3 port 47938
Feb 23 10:21:02 compute-0 sshd-session[32147]: Connection closed by invalid user admin 157.20.215.3 port 47938 [preauth]
Feb 23 10:21:03 compute-0 sshd-session[32149]: Invalid user admin from 157.20.215.3 port 47952
Feb 23 10:21:04 compute-0 sshd-session[32149]: Connection closed by invalid user admin 157.20.215.3 port 47952 [preauth]
Feb 23 10:21:06 compute-0 sshd-session[32151]: Invalid user admin from 157.20.215.3 port 47974
Feb 23 10:21:06 compute-0 sshd-session[32151]: Connection closed by invalid user admin 157.20.215.3 port 47974 [preauth]
Feb 23 10:21:08 compute-0 sshd-session[32153]: Invalid user admin from 157.20.215.3 port 46196
Feb 23 10:21:08 compute-0 sshd-session[32153]: Connection closed by invalid user admin 157.20.215.3 port 46196 [preauth]
Feb 23 10:21:10 compute-0 sshd-session[32155]: Invalid user admin from 157.20.215.3 port 46212
Feb 23 10:21:11 compute-0 sshd-session[32155]: Connection closed by invalid user admin 157.20.215.3 port 46212 [preauth]
Feb 23 10:21:12 compute-0 sshd-session[32157]: Invalid user admin from 157.20.215.3 port 46216
Feb 23 10:21:13 compute-0 sshd-session[32157]: Connection closed by invalid user admin 157.20.215.3 port 46216 [preauth]
Feb 23 10:21:15 compute-0 sshd-session[32159]: Invalid user admin from 157.20.215.3 port 46226
Feb 23 10:21:15 compute-0 sshd-session[32159]: Connection closed by invalid user admin 157.20.215.3 port 46226 [preauth]
Feb 23 10:21:17 compute-0 sshd-session[32161]: Invalid user admin from 157.20.215.3 port 46232
Feb 23 10:21:17 compute-0 sshd-session[32161]: Connection closed by invalid user admin 157.20.215.3 port 46232 [preauth]
Feb 23 10:21:19 compute-0 sshd-session[32163]: Invalid user admin from 157.20.215.3 port 50896
Feb 23 10:21:19 compute-0 sshd-session[32163]: Connection closed by invalid user admin 157.20.215.3 port 50896 [preauth]
Feb 23 10:21:21 compute-0 sshd-session[32167]: Connection closed by 143.198.30.3 port 40022
Feb 23 10:21:21 compute-0 sshd-session[32168]: Connection closed by authenticating user root 143.198.30.3 port 40032 [preauth]
Feb 23 10:21:21 compute-0 sshd-session[32165]: Invalid user admin from 157.20.215.3 port 50908
Feb 23 10:21:23 compute-0 sshd-session[32165]: Connection closed by invalid user admin 157.20.215.3 port 50908 [preauth]
Feb 23 10:21:24 compute-0 sshd-session[32170]: Invalid user admin from 157.20.215.3 port 50918
Feb 23 10:21:24 compute-0 sshd-session[32170]: Connection closed by invalid user admin 157.20.215.3 port 50918 [preauth]
Feb 23 10:21:26 compute-0 sshd-session[32172]: Invalid user admin from 157.20.215.3 port 50922
Feb 23 10:21:26 compute-0 sshd-session[32172]: Connection closed by invalid user admin 157.20.215.3 port 50922 [preauth]
Feb 23 10:21:28 compute-0 sshd-session[32174]: Invalid user admin from 157.20.215.3 port 35344
Feb 23 10:21:28 compute-0 sshd-session[32174]: Connection closed by invalid user admin 157.20.215.3 port 35344 [preauth]
Feb 23 10:21:30 compute-0 sshd-session[32176]: Invalid user admin from 157.20.215.3 port 35348
Feb 23 10:21:31 compute-0 sshd-session[32176]: Connection closed by invalid user admin 157.20.215.3 port 35348 [preauth]
Feb 23 10:21:32 compute-0 sshd-session[32178]: Invalid user admin from 157.20.215.3 port 35356
Feb 23 10:21:33 compute-0 sshd-session[32178]: Connection closed by invalid user admin 157.20.215.3 port 35356 [preauth]
Feb 23 10:21:35 compute-0 sshd-session[32180]: Invalid user admin from 157.20.215.3 port 35368
Feb 23 10:21:35 compute-0 sshd-session[32180]: Connection closed by invalid user admin 157.20.215.3 port 35368 [preauth]
Feb 23 10:21:35 compute-0 sshd-session[32182]: Connection closed by authenticating user root 165.227.79.48 port 53086 [preauth]
Feb 23 10:21:37 compute-0 sshd-session[32184]: Invalid user admin from 157.20.215.3 port 33594
Feb 23 10:21:37 compute-0 sshd-session[32184]: Connection closed by invalid user admin 157.20.215.3 port 33594 [preauth]
Feb 23 10:21:39 compute-0 sshd-session[32186]: Invalid user admin from 157.20.215.3 port 33600
Feb 23 10:21:40 compute-0 sshd-session[32186]: Connection closed by invalid user admin 157.20.215.3 port 33600 [preauth]
Feb 23 10:21:41 compute-0 sshd-session[32188]: Invalid user admin from 157.20.215.3 port 33610
Feb 23 10:21:42 compute-0 sshd-session[32188]: Connection closed by invalid user admin 157.20.215.3 port 33610 [preauth]
Feb 23 10:21:44 compute-0 sshd-session[32190]: Invalid user admin from 157.20.215.3 port 33614
Feb 23 10:21:44 compute-0 sshd-session[32190]: Connection closed by invalid user admin 157.20.215.3 port 33614 [preauth]
Feb 23 10:21:46 compute-0 sshd-session[32192]: Invalid user admin from 157.20.215.3 port 33620
Feb 23 10:21:46 compute-0 sshd-session[32192]: Connection closed by invalid user admin 157.20.215.3 port 33620 [preauth]
Feb 23 10:21:48 compute-0 sshd-session[32194]: Invalid user admin from 157.20.215.3 port 48950
Feb 23 10:21:48 compute-0 sshd-session[32194]: Connection closed by invalid user admin 157.20.215.3 port 48950 [preauth]
Feb 23 10:21:50 compute-0 sshd-session[32196]: Invalid user admin from 157.20.215.3 port 48956
Feb 23 10:21:50 compute-0 sshd-session[32196]: Connection closed by invalid user admin 157.20.215.3 port 48956 [preauth]
Feb 23 10:21:52 compute-0 sshd-session[32198]: Invalid user admin from 157.20.215.3 port 48960
Feb 23 10:21:52 compute-0 sshd-session[32198]: Connection closed by invalid user admin 157.20.215.3 port 48960 [preauth]
Feb 23 10:21:54 compute-0 sshd-session[32200]: Invalid user admin from 157.20.215.3 port 48964
Feb 23 10:21:55 compute-0 sshd-session[32200]: Connection closed by invalid user admin 157.20.215.3 port 48964 [preauth]
Feb 23 10:21:57 compute-0 sshd-session[32202]: Invalid user admin from 157.20.215.3 port 48978
Feb 23 10:21:57 compute-0 sshd-session[32202]: Connection closed by invalid user admin 157.20.215.3 port 48978 [preauth]
Feb 23 10:21:57 compute-0 sshd-session[32206]: Connection closed by authenticating user root 143.198.30.3 port 55422 [preauth]
Feb 23 10:21:59 compute-0 sshd-session[32204]: Invalid user admin from 157.20.215.3 port 40984
Feb 23 10:21:59 compute-0 sshd-session[32204]: Connection closed by invalid user admin 157.20.215.3 port 40984 [preauth]
Feb 23 10:22:01 compute-0 sshd-session[32208]: Invalid user admin from 157.20.215.3 port 40986
Feb 23 10:22:01 compute-0 sshd-session[32208]: Connection closed by invalid user admin 157.20.215.3 port 40986 [preauth]
Feb 23 10:22:03 compute-0 sshd-session[32210]: Invalid user admin from 157.20.215.3 port 40996
Feb 23 10:22:04 compute-0 sshd-session[32210]: Connection closed by invalid user admin 157.20.215.3 port 40996 [preauth]
Feb 23 10:22:05 compute-0 sshd-session[32212]: Invalid user admin from 157.20.215.3 port 41012
Feb 23 10:22:06 compute-0 sshd-session[32212]: Connection closed by invalid user admin 157.20.215.3 port 41012 [preauth]
Feb 23 10:22:08 compute-0 sshd-session[32214]: Invalid user admin from 157.20.215.3 port 41472
Feb 23 10:22:08 compute-0 sshd-session[32214]: Connection closed by invalid user admin 157.20.215.3 port 41472 [preauth]
Feb 23 10:22:10 compute-0 sshd-session[32216]: Invalid user admin from 157.20.215.3 port 41480
Feb 23 10:22:10 compute-0 sshd-session[32216]: Connection closed by invalid user admin 157.20.215.3 port 41480 [preauth]
Feb 23 10:22:12 compute-0 sshd-session[32218]: Invalid user admin from 157.20.215.3 port 41488
Feb 23 10:22:12 compute-0 sshd-session[32218]: Connection closed by invalid user admin 157.20.215.3 port 41488 [preauth]
Feb 23 10:22:14 compute-0 sshd-session[32220]: Invalid user admin from 157.20.215.3 port 41504
Feb 23 10:22:14 compute-0 sshd-session[32220]: Connection closed by invalid user admin 157.20.215.3 port 41504 [preauth]
Feb 23 10:22:16 compute-0 sshd-session[32222]: Invalid user admin from 157.20.215.3 port 41506
Feb 23 10:22:17 compute-0 sshd-session[32222]: Connection closed by invalid user admin 157.20.215.3 port 41506 [preauth]
Feb 23 10:22:18 compute-0 sshd-session[32224]: Invalid user admin from 157.20.215.3 port 39856
Feb 23 10:22:19 compute-0 sshd-session[32224]: Connection closed by invalid user admin 157.20.215.3 port 39856 [preauth]
Feb 23 10:22:20 compute-0 sshd-session[32226]: Invalid user admin from 157.20.215.3 port 39864
Feb 23 10:22:21 compute-0 sshd-session[32226]: Connection closed by invalid user admin 157.20.215.3 port 39864 [preauth]
Feb 23 10:22:22 compute-0 sshd-session[32228]: Invalid user admin from 157.20.215.3 port 39872
Feb 23 10:22:23 compute-0 sshd-session[32228]: Connection closed by invalid user admin 157.20.215.3 port 39872 [preauth]
Feb 23 10:22:25 compute-0 sshd-session[32230]: Invalid user admin from 157.20.215.3 port 39886
Feb 23 10:22:25 compute-0 sshd-session[32230]: Connection closed by invalid user admin 157.20.215.3 port 39886 [preauth]
Feb 23 10:22:27 compute-0 sshd-session[32232]: Invalid user admin from 157.20.215.3 port 39892
Feb 23 10:22:27 compute-0 sshd-session[32232]: Connection closed by invalid user admin 157.20.215.3 port 39892 [preauth]
Feb 23 10:22:29 compute-0 sshd-session[32234]: Invalid user admin from 157.20.215.3 port 52496
Feb 23 10:22:29 compute-0 sshd-session[32234]: Connection closed by invalid user admin 157.20.215.3 port 52496 [preauth]
Feb 23 10:22:31 compute-0 sshd-session[32236]: Invalid user admin from 157.20.215.3 port 52504
Feb 23 10:22:32 compute-0 sshd-session[32236]: Connection closed by invalid user admin 157.20.215.3 port 52504 [preauth]
Feb 23 10:22:32 compute-0 sshd-session[32240]: Connection closed by authenticating user root 165.227.79.48 port 32858 [preauth]
Feb 23 10:22:33 compute-0 sshd-session[32238]: Invalid user admin from 157.20.215.3 port 52512
Feb 23 10:22:34 compute-0 sshd-session[32238]: Connection closed by invalid user admin 157.20.215.3 port 52512 [preauth]
Feb 23 10:22:35 compute-0 sshd-session[32244]: Connection closed by authenticating user root 143.198.30.3 port 59110 [preauth]
Feb 23 10:22:36 compute-0 sshd-session[32242]: Invalid user admin from 157.20.215.3 port 52528
Feb 23 10:22:36 compute-0 sshd-session[32242]: Connection closed by invalid user admin 157.20.215.3 port 52528 [preauth]
Feb 23 10:22:37 compute-0 sshd-session[30989]: Received disconnect from 38.102.83.129 port 35548:11: disconnected by user
Feb 23 10:22:37 compute-0 sshd-session[30989]: Disconnected from user zuul 38.102.83.129 port 35548
Feb 23 10:22:37 compute-0 sshd-session[30986]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:22:37 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 23 10:22:37 compute-0 systemd[1]: session-7.scope: Consumed 3.787s CPU time.
Feb 23 10:22:37 compute-0 systemd-logind[808]: Session 7 logged out. Waiting for processes to exit.
Feb 23 10:22:37 compute-0 systemd-logind[808]: Removed session 7.
Feb 23 10:22:38 compute-0 sshd-session[32246]: Invalid user admin from 157.20.215.3 port 34892
Feb 23 10:22:38 compute-0 sshd-session[32246]: Connection closed by invalid user admin 157.20.215.3 port 34892 [preauth]
Feb 23 10:22:40 compute-0 sshd-session[32249]: Invalid user admin from 157.20.215.3 port 34902
Feb 23 10:22:40 compute-0 sshd-session[32249]: Connection closed by invalid user admin 157.20.215.3 port 34902 [preauth]
Feb 23 10:22:42 compute-0 sshd-session[32251]: Invalid user admin from 157.20.215.3 port 34904
Feb 23 10:22:43 compute-0 sshd-session[32251]: Connection closed by invalid user admin 157.20.215.3 port 34904 [preauth]
Feb 23 10:22:44 compute-0 sshd-session[32253]: Invalid user admin from 157.20.215.3 port 34912
Feb 23 10:22:45 compute-0 sshd-session[32253]: Connection closed by invalid user admin 157.20.215.3 port 34912 [preauth]
Feb 23 10:22:47 compute-0 sshd-session[32255]: Invalid user admin from 157.20.215.3 port 34918
Feb 23 10:22:47 compute-0 sshd-session[32255]: Connection closed by invalid user admin 157.20.215.3 port 34918 [preauth]
Feb 23 10:22:49 compute-0 sshd-session[32257]: Invalid user admin from 157.20.215.3 port 60768
Feb 23 10:22:49 compute-0 sshd-session[32257]: Connection closed by invalid user admin 157.20.215.3 port 60768 [preauth]
Feb 23 10:22:51 compute-0 sshd-session[32259]: Invalid user admin from 157.20.215.3 port 60776
Feb 23 10:22:51 compute-0 sshd-session[32259]: Connection closed by invalid user admin 157.20.215.3 port 60776 [preauth]
Feb 23 10:22:53 compute-0 sshd-session[32261]: Invalid user admin from 157.20.215.3 port 60780
Feb 23 10:22:53 compute-0 sshd-session[32261]: Connection closed by invalid user admin 157.20.215.3 port 60780 [preauth]
Feb 23 10:22:55 compute-0 sshd-session[32264]: Invalid user admin from 157.20.215.3 port 60792
Feb 23 10:22:55 compute-0 sshd-session[32264]: Connection closed by invalid user admin 157.20.215.3 port 60792 [preauth]
Feb 23 10:22:57 compute-0 sshd-session[32266]: Invalid user admin from 157.20.215.3 port 55614
Feb 23 10:22:58 compute-0 sshd-session[32266]: Connection closed by invalid user admin 157.20.215.3 port 55614 [preauth]
Feb 23 10:22:59 compute-0 sshd-session[32268]: Invalid user admin from 157.20.215.3 port 55618
Feb 23 10:23:00 compute-0 sshd-session[32268]: Connection closed by invalid user admin 157.20.215.3 port 55618 [preauth]
Feb 23 10:23:01 compute-0 sshd-session[32270]: Invalid user admin from 157.20.215.3 port 55624
Feb 23 10:23:02 compute-0 sshd-session[32270]: Connection closed by invalid user admin 157.20.215.3 port 55624 [preauth]
Feb 23 10:23:07 compute-0 sshd-session[32272]: Invalid user admin from 157.20.215.3 port 55640
Feb 23 10:23:07 compute-0 sshd-session[32272]: Connection closed by invalid user admin 157.20.215.3 port 55640 [preauth]
Feb 23 10:23:09 compute-0 sshd-session[32274]: Invalid user admin from 157.20.215.3 port 53562
Feb 23 10:23:09 compute-0 sshd-session[32274]: Connection closed by invalid user admin 157.20.215.3 port 53562 [preauth]
Feb 23 10:23:11 compute-0 sshd-session[32276]: Invalid user admin from 157.20.215.3 port 53564
Feb 23 10:23:12 compute-0 sshd-session[32276]: Connection closed by invalid user admin 157.20.215.3 port 53564 [preauth]
Feb 23 10:23:12 compute-0 sshd-session[32278]: Connection closed by authenticating user root 143.198.30.3 port 53336 [preauth]
Feb 23 10:23:14 compute-0 sshd-session[32280]: Invalid user admin from 157.20.215.3 port 53576
Feb 23 10:23:14 compute-0 sshd-session[32280]: Connection closed by invalid user admin 157.20.215.3 port 53576 [preauth]
Feb 23 10:23:16 compute-0 sshd-session[32282]: Invalid user admin from 157.20.215.3 port 53582
Feb 23 10:23:16 compute-0 sshd-session[32282]: Connection closed by invalid user admin 157.20.215.3 port 53582 [preauth]
Feb 23 10:23:18 compute-0 sshd-session[32284]: Invalid user admin from 157.20.215.3 port 60256
Feb 23 10:23:18 compute-0 sshd-session[32284]: Connection closed by invalid user admin 157.20.215.3 port 60256 [preauth]
Feb 23 10:23:20 compute-0 sshd-session[32286]: Invalid user admin from 157.20.215.3 port 60264
Feb 23 10:23:21 compute-0 sshd-session[32286]: Connection closed by invalid user admin 157.20.215.3 port 60264 [preauth]
Feb 23 10:23:22 compute-0 sshd-session[32288]: Invalid user admin from 157.20.215.3 port 60272
Feb 23 10:23:23 compute-0 sshd-session[32288]: Connection closed by invalid user admin 157.20.215.3 port 60272 [preauth]
Feb 23 10:23:25 compute-0 sshd-session[32290]: Invalid user admin from 157.20.215.3 port 60280
Feb 23 10:23:25 compute-0 sshd-session[32290]: Connection closed by invalid user admin 157.20.215.3 port 60280 [preauth]
Feb 23 10:23:27 compute-0 sshd-session[32292]: Invalid user admin from 157.20.215.3 port 39122
Feb 23 10:23:27 compute-0 sshd-session[32292]: Connection closed by invalid user admin 157.20.215.3 port 39122 [preauth]
Feb 23 10:23:27 compute-0 sshd-session[32294]: Connection closed by authenticating user root 165.227.79.48 port 48902 [preauth]
Feb 23 10:23:29 compute-0 sshd-session[32296]: Invalid user admin from 157.20.215.3 port 39134
Feb 23 10:23:29 compute-0 sshd-session[32296]: Connection closed by invalid user admin 157.20.215.3 port 39134 [preauth]
Feb 23 10:23:31 compute-0 sshd-session[32298]: Invalid user admin from 157.20.215.3 port 39166
Feb 23 10:23:31 compute-0 sshd-session[32298]: Connection closed by invalid user admin 157.20.215.3 port 39166 [preauth]
Feb 23 10:23:33 compute-0 sshd-session[32300]: Invalid user admin from 157.20.215.3 port 39182
Feb 23 10:23:34 compute-0 sshd-session[32300]: Connection closed by invalid user admin 157.20.215.3 port 39182 [preauth]
Feb 23 10:23:36 compute-0 sshd-session[32302]: Invalid user admin from 157.20.215.3 port 39200
Feb 23 10:23:36 compute-0 sshd-session[32302]: Connection closed by invalid user admin 157.20.215.3 port 39200 [preauth]
Feb 23 10:23:38 compute-0 sshd-session[32304]: Invalid user admin from 157.20.215.3 port 56296
Feb 23 10:23:38 compute-0 sshd-session[32304]: Connection closed by invalid user admin 157.20.215.3 port 56296 [preauth]
Feb 23 10:23:40 compute-0 sshd-session[32306]: Invalid user admin from 157.20.215.3 port 56298
Feb 23 10:23:41 compute-0 sshd-session[32306]: Connection closed by invalid user admin 157.20.215.3 port 56298 [preauth]
Feb 23 10:23:43 compute-0 sshd-session[32308]: Invalid user pi from 157.20.215.3 port 56306
Feb 23 10:23:43 compute-0 sshd-session[32308]: Connection closed by invalid user pi 157.20.215.3 port 56306 [preauth]
Feb 23 10:23:45 compute-0 sshd-session[32310]: Connection closed by authenticating user ftp 157.20.215.3 port 56310 [preauth]
Feb 23 10:23:47 compute-0 sshd-session[32312]: Connection closed by authenticating user root 143.198.30.3 port 54112 [preauth]
Feb 23 10:24:21 compute-0 sshd-session[32315]: Connection closed by authenticating user root 165.227.79.48 port 59886 [preauth]
Feb 23 10:24:23 compute-0 sshd-session[32317]: Connection closed by authenticating user root 143.198.30.3 port 47672 [preauth]
Feb 23 10:24:58 compute-0 sshd-session[32323]: Connection closed by authenticating user root 143.198.30.3 port 58466 [preauth]
Feb 23 10:25:14 compute-0 sshd-session[32325]: Connection closed by authenticating user root 165.227.79.48 port 35898 [preauth]
Feb 23 10:25:34 compute-0 sshd-session[32327]: Connection closed by authenticating user root 143.198.30.3 port 49958 [preauth]
Feb 23 10:26:05 compute-0 sshd-session[32329]: Connection closed by authenticating user root 165.227.79.48 port 42682 [preauth]
Feb 23 10:26:10 compute-0 sshd-session[32331]: Connection closed by authenticating user root 143.198.30.3 port 56588 [preauth]
Feb 23 10:26:33 compute-0 sshd-session[32333]: Invalid user ubnt from 185.156.73.233 port 48830
Feb 23 10:26:33 compute-0 sshd-session[32333]: Connection closed by invalid user ubnt 185.156.73.233 port 48830 [preauth]
Feb 23 10:26:44 compute-0 sshd-session[32336]: Connection closed by authenticating user root 143.198.30.3 port 36274 [preauth]
Feb 23 10:26:53 compute-0 sshd-session[32338]: Connection closed by authenticating user root 165.227.79.48 port 36970 [preauth]
Feb 23 10:27:20 compute-0 sshd-session[32340]: Connection closed by authenticating user root 143.198.30.3 port 43492 [preauth]
Feb 23 10:27:43 compute-0 sshd-session[32342]: Connection closed by authenticating user root 165.227.79.48 port 53030 [preauth]
Feb 23 10:27:55 compute-0 sshd-session[32344]: Connection closed by authenticating user root 143.198.30.3 port 56862 [preauth]
Feb 23 10:28:30 compute-0 sshd-session[32347]: Connection closed by authenticating user root 143.198.30.3 port 51332 [preauth]
Feb 23 10:28:35 compute-0 sshd-session[32349]: Connection closed by authenticating user root 165.227.79.48 port 33576 [preauth]
Feb 23 10:29:05 compute-0 sshd-session[32351]: Connection closed by authenticating user root 143.198.30.3 port 50986 [preauth]
Feb 23 10:29:24 compute-0 sshd-session[32353]: Connection closed by authenticating user root 165.227.79.48 port 45114 [preauth]
Feb 23 10:29:38 compute-0 sshd-session[32355]: Connection closed by authenticating user root 143.198.30.3 port 35690 [preauth]
Feb 23 10:30:11 compute-0 sshd-session[32358]: Accepted publickey for zuul from 192.168.122.30 port 59254 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:30:11 compute-0 systemd-logind[808]: New session 8 of user zuul.
Feb 23 10:30:11 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 23 10:30:11 compute-0 sshd-session[32358]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:30:12 compute-0 python3.9[32511]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:30:13 compute-0 sudo[32690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xerongpwnrunxldyjnborajujdzsskig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842612.8373709-39-123885270545815/AnsiballZ_command.py'
Feb 23 10:30:13 compute-0 sudo[32690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:13 compute-0 python3.9[32693]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:30:14 compute-0 sshd-session[32704]: Connection closed by authenticating user root 165.227.79.48 port 50030 [preauth]
Feb 23 10:30:15 compute-0 sshd-session[32706]: Connection closed by authenticating user root 143.198.30.3 port 52780 [preauth]
Feb 23 10:30:20 compute-0 sudo[32690]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:20 compute-0 sshd-session[32361]: Connection closed by 192.168.122.30 port 59254
Feb 23 10:30:20 compute-0 sshd-session[32358]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:30:20 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 23 10:30:20 compute-0 systemd[1]: session-8.scope: Consumed 7.183s CPU time.
Feb 23 10:30:20 compute-0 systemd-logind[808]: Session 8 logged out. Waiting for processes to exit.
Feb 23 10:30:20 compute-0 systemd-logind[808]: Removed session 8.
Feb 23 10:30:25 compute-0 sshd-session[32754]: Accepted publickey for zuul from 192.168.122.30 port 58600 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:30:25 compute-0 systemd-logind[808]: New session 9 of user zuul.
Feb 23 10:30:25 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 23 10:30:25 compute-0 sshd-session[32754]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:30:26 compute-0 python3.9[32907]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:30:27 compute-0 sshd-session[32757]: Connection closed by 192.168.122.30 port 58600
Feb 23 10:30:27 compute-0 sshd-session[32754]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:30:27 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 23 10:30:27 compute-0 systemd-logind[808]: Session 9 logged out. Waiting for processes to exit.
Feb 23 10:30:27 compute-0 systemd-logind[808]: Removed session 9.
Feb 23 10:30:42 compute-0 sshd-session[32935]: Accepted publickey for zuul from 192.168.122.30 port 45672 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:30:42 compute-0 systemd-logind[808]: New session 10 of user zuul.
Feb 23 10:30:42 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 23 10:30:43 compute-0 sshd-session[32935]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:30:43 compute-0 python3.9[33088]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 23 10:30:45 compute-0 python3.9[33262]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:30:45 compute-0 sudo[33412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txlxulzthtgettskfxqrowmkqwilhtbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842645.372421-64-133316106816066/AnsiballZ_command.py'
Feb 23 10:30:45 compute-0 sudo[33412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:45 compute-0 python3.9[33415]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:30:45 compute-0 sudo[33412]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:46 compute-0 sudo[33566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kclodqjozouyjmdpuzzterrdtrudhstn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842646.377898-88-237899337256449/AnsiballZ_stat.py'
Feb 23 10:30:46 compute-0 sudo[33566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:47 compute-0 python3.9[33569]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:30:47 compute-0 sudo[33566]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:47 compute-0 sudo[33719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvqljkyauylnnykkhkwnnrgzemfskwwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842647.2889805-104-106233183549543/AnsiballZ_file.py'
Feb 23 10:30:47 compute-0 sudo[33719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:47 compute-0 python3.9[33722]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:30:47 compute-0 sudo[33719]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:48 compute-0 sudo[33872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjizgituwflofjjuymxudrldkkavhsuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842648.2472086-120-91844707128942/AnsiballZ_stat.py'
Feb 23 10:30:48 compute-0 sudo[33872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:48 compute-0 python3.9[33875]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:30:48 compute-0 sudo[33872]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:49 compute-0 sudo[33996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnnwygfhqvvalrbgluejmfwwwcueasgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842648.2472086-120-91844707128942/AnsiballZ_copy.py'
Feb 23 10:30:49 compute-0 sudo[33996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:49 compute-0 python3.9[33999]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771842648.2472086-120-91844707128942/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:30:49 compute-0 sudo[33996]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:49 compute-0 sshd-session[34024]: Connection closed by authenticating user root 143.198.30.3 port 49408 [preauth]
Feb 23 10:30:50 compute-0 sudo[34151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgebctmpqpgkduiwojdyvojlmczqvziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842649.7161574-150-158558767361201/AnsiballZ_setup.py'
Feb 23 10:30:50 compute-0 sudo[34151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:50 compute-0 python3.9[34154]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:30:50 compute-0 sudo[34151]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:50 compute-0 sudo[34308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klrartnivkrqwciflnrnvmasicdmuuds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842650.642863-166-140669949361152/AnsiballZ_file.py'
Feb 23 10:30:50 compute-0 sudo[34308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:51 compute-0 python3.9[34311]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:30:51 compute-0 sudo[34308]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:51 compute-0 sudo[34461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjpmneyluajkxuhitgquuvyqxalaszya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842651.446767-184-191693530923121/AnsiballZ_file.py'
Feb 23 10:30:51 compute-0 sudo[34461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:51 compute-0 python3.9[34464]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:30:51 compute-0 sudo[34461]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:52 compute-0 python3.9[34614]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:30:56 compute-0 python3.9[34868]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:30:56 compute-0 python3.9[35018]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:30:58 compute-0 python3.9[35172]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:30:58 compute-0 sudo[35328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rexlbfyghilvhqtphqklpkhhzzqsqcwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842658.6020691-280-272782406266030/AnsiballZ_setup.py'
Feb 23 10:30:58 compute-0 sudo[35328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:59 compute-0 python3.9[35331]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:30:59 compute-0 sudo[35328]: pam_unix(sudo:session): session closed for user root
Feb 23 10:30:59 compute-0 sudo[35413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfbqlfbfnozbrbyesabssjwtgbwwhjtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842658.6020691-280-272782406266030/AnsiballZ_dnf.py'
Feb 23 10:30:59 compute-0 sudo[35413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:30:59 compute-0 python3.9[35416]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:31:06 compute-0 sshd-session[35485]: Connection closed by authenticating user root 165.227.79.48 port 53640 [preauth]
Feb 23 10:31:24 compute-0 sshd-session[35558]: Connection closed by authenticating user root 143.198.30.3 port 48316 [preauth]
Feb 23 10:31:41 compute-0 systemd[1]: Reloading.
Feb 23 10:31:41 compute-0 systemd-rc-local-generator[35616]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:31:41 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 23 10:31:41 compute-0 systemd[1]: Reloading.
Feb 23 10:31:41 compute-0 systemd-rc-local-generator[35673]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:31:42 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 23 10:31:42 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 23 10:31:42 compute-0 systemd[1]: Reloading.
Feb 23 10:31:42 compute-0 systemd-rc-local-generator[35713]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:31:42 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 23 10:31:42 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 23 10:31:42 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 23 10:31:58 compute-0 sshd-session[35813]: Connection closed by authenticating user root 165.227.79.48 port 49234 [preauth]
Feb 23 10:31:59 compute-0 sshd-session[35819]: Connection closed by authenticating user root 143.198.30.3 port 40020 [preauth]
Feb 23 10:32:34 compute-0 kernel: SELinux:  Converting 2726 SID table entries...
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:32:34 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:32:34 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 23 10:32:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:32:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:32:34 compute-0 systemd[1]: Reloading.
Feb 23 10:32:34 compute-0 systemd-rc-local-generator[36048]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:32:34 compute-0 systemd[1]: Starting dnf makecache...
Feb 23 10:32:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:32:35 compute-0 dnf[36123]: Failed determining last makecache time.
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-barbican-42b4c41831408a8e323 104 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-glean-642fffe0203a8ffcc2443db52 164 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 sshd-session[36420]: Connection closed by authenticating user root 143.198.30.3 port 53900 [preauth]
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-cinder-e95a374f4f00ef02d562d 156 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-stevedore-c4acc5639fd2329372142 154 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-cloudkitty-tests-tempest-ef9563 134 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-diskimage-builder-cbb4478c143869181ba9 145 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-nova-5cfeecbf22fca58822607dd 157 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-designate-tests-tempest-347fdbc 137 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-glance-1fd12c29b339f30fe823e 161 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 sudo[35413]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 153 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-manila-8fa2b5793100022b4d0f6 147 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-whitebox-neutron-tests-tempest- 158 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-octavia-76dfc1e35cf7f4dd6102 160 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-watcher-c014f81a8647287f6dcc 151 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:32:35 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:32:35 compute-0 systemd[1]: run-re437d94803594521b4bc33fde18ae990.service: Deactivated successfully.
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-tcib-b403f1051724db0286e1418f59 151 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 154 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-swift-dc98a8463506ac520c469a 153 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-python-tempestconf-8e33668cda707818ee1 162 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 dnf[36123]: delorean-openstack-heat-ui-013accbfd179753bc3f0 155 kB/s | 3.0 kB     00:00
Feb 23 10:32:35 compute-0 sudo[37006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cftmdytwvkufqksgmijtfuxiknsfhnki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842755.5295837-304-224606041356442/AnsiballZ_command.py'
Feb 23 10:32:35 compute-0 sudo[37006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:35 compute-0 dnf[36123]: CentOS Stream 9 - BaseOS                         30 kB/s | 7.0 kB     00:00
Feb 23 10:32:35 compute-0 python3.9[37009]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:32:36 compute-0 dnf[36123]: CentOS Stream 9 - AppStream                      29 kB/s | 7.1 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: CentOS Stream 9 - CRB                            65 kB/s | 6.9 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: CentOS Stream 9 - Extras packages                83 kB/s | 7.6 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: dlrn-antelope-testing                           166 kB/s | 3.0 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: dlrn-antelope-build-deps                        155 kB/s | 3.0 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: centos9-rabbitmq                                136 kB/s | 3.0 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: centos9-storage                                 155 kB/s | 3.0 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: centos9-opstools                                144 kB/s | 3.0 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: NFV SIG OpenvSwitch                             144 kB/s | 3.0 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: repo-setup-centos-appstream                     193 kB/s | 4.4 kB     00:00
Feb 23 10:32:36 compute-0 sudo[37006]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:36 compute-0 dnf[36123]: repo-setup-centos-baseos                        160 kB/s | 3.9 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: repo-setup-centos-highavailability              177 kB/s | 3.9 kB     00:00
Feb 23 10:32:36 compute-0 dnf[36123]: repo-setup-centos-powertools                    190 kB/s | 4.3 kB     00:00
Feb 23 10:32:37 compute-0 dnf[36123]: Extra Packages for Enterprise Linux 9 - x86_64  140 kB/s |  29 kB     00:00
Feb 23 10:32:37 compute-0 dnf[36123]: Metadata cache created.
Feb 23 10:32:37 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 23 10:32:37 compute-0 systemd[1]: Finished dnf makecache.
Feb 23 10:32:37 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.663s CPU time.
Feb 23 10:32:37 compute-0 sudo[37310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jptdiyewofxggghgkuuargbeqliaebsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842757.1563709-320-31149329208705/AnsiballZ_selinux.py'
Feb 23 10:32:37 compute-0 sudo[37310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:38 compute-0 python3.9[37313]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 23 10:32:38 compute-0 sudo[37310]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:38 compute-0 sudo[37464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvtshwcuabjolopwhujqqcmeetuexzqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842758.481161-342-102313860253283/AnsiballZ_command.py'
Feb 23 10:32:38 compute-0 sudo[37464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:38 compute-0 python3.9[37467]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 23 10:32:39 compute-0 sudo[37464]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:39 compute-0 sudo[37618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztvkpnuiyrntjjnreqkrlewclfzdyykc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842759.6718671-358-254513732549359/AnsiballZ_file.py'
Feb 23 10:32:39 compute-0 sudo[37618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:40 compute-0 python3.9[37621]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:32:40 compute-0 sudo[37618]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:41 compute-0 sudo[37771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdsbhjrypwnntlvjcaoxhkhwshzfxwfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842760.9872735-374-236176763578875/AnsiballZ_mount.py'
Feb 23 10:32:41 compute-0 sudo[37771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:41 compute-0 python3.9[37774]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 23 10:32:41 compute-0 sudo[37771]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:42 compute-0 sudo[37924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqfhdbsbscwhvyivrruvsutndfpkkngc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842762.4800563-430-169257401915691/AnsiballZ_file.py'
Feb 23 10:32:42 compute-0 sudo[37924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:42 compute-0 python3.9[37927]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:32:42 compute-0 sudo[37924]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:43 compute-0 sudo[38077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhlwuwkdwniretjtixilhdkaqthsqrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842763.1043575-446-150425818551069/AnsiballZ_stat.py'
Feb 23 10:32:43 compute-0 sudo[38077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:43 compute-0 python3.9[38080]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:32:43 compute-0 sudo[38077]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:43 compute-0 sudo[38201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdjbgyygfmsssynpzlecqgscbhvzpwyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842763.1043575-446-150425818551069/AnsiballZ_copy.py'
Feb 23 10:32:43 compute-0 sudo[38201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:43 compute-0 python3.9[38204]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771842763.1043575-446-150425818551069/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:32:43 compute-0 sudo[38201]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:44 compute-0 sudo[38354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukoyuspsdvocaekqolrsvqzeksocqmvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842764.5965931-494-108469280382766/AnsiballZ_stat.py'
Feb 23 10:32:44 compute-0 sudo[38354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:47 compute-0 python3.9[38357]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:32:47 compute-0 sudo[38354]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:47 compute-0 sshd-session[38417]: Connection closed by authenticating user root 165.227.79.48 port 33574 [preauth]
Feb 23 10:32:48 compute-0 sudo[38509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrrpevdqfpemqymplhairkgxnkxupafr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842767.8373537-510-248397456194141/AnsiballZ_command.py'
Feb 23 10:32:48 compute-0 sudo[38509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:48 compute-0 python3.9[38512]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:32:48 compute-0 sudo[38509]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:48 compute-0 sudo[38663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjdebqfddmzrkebcreypuahjdgvrumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842768.5852401-526-39320392856693/AnsiballZ_file.py'
Feb 23 10:32:48 compute-0 sudo[38663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:49 compute-0 python3.9[38666]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:32:49 compute-0 sudo[38663]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:49 compute-0 sudo[38816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltohztyosihmfsunucainveclteuzshv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842769.4297433-548-156904700879969/AnsiballZ_getent.py'
Feb 23 10:32:49 compute-0 sudo[38816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:49 compute-0 python3.9[38819]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 23 10:32:49 compute-0 sudo[38816]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:49 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:32:50 compute-0 sudo[38971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbjfifhulwxxfdqizugvitytauysghr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842770.2095695-564-1139947681465/AnsiballZ_group.py'
Feb 23 10:32:50 compute-0 sudo[38971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:50 compute-0 python3.9[38974]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 10:32:50 compute-0 groupadd[38975]: group added to /etc/group: name=qemu, GID=107
Feb 23 10:32:50 compute-0 groupadd[38975]: group added to /etc/gshadow: name=qemu
Feb 23 10:32:50 compute-0 groupadd[38975]: new group: name=qemu, GID=107
Feb 23 10:32:50 compute-0 sudo[38971]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:51 compute-0 sudo[39130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwvlpyvbrilndfmhlmenplcxnyftttti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842771.064644-580-224570913092055/AnsiballZ_user.py'
Feb 23 10:32:51 compute-0 sudo[39130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:51 compute-0 python3.9[39133]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 10:32:51 compute-0 useradd[39135]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/1
Feb 23 10:32:51 compute-0 sudo[39130]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:52 compute-0 sudo[39291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvxjsqortzlrfkqtjqcfhgkbswlbmid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842772.0398192-596-141446840846113/AnsiballZ_getent.py'
Feb 23 10:32:52 compute-0 sudo[39291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:52 compute-0 python3.9[39294]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 23 10:32:52 compute-0 sudo[39291]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:52 compute-0 sudo[39445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqgzdwxsqrrepczmpypqxhffxiqvvfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842772.6945968-612-155978049269344/AnsiballZ_group.py'
Feb 23 10:32:54 compute-0 sudo[39445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:54 compute-0 python3.9[39448]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 10:32:54 compute-0 groupadd[39449]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 23 10:32:54 compute-0 groupadd[39449]: group added to /etc/gshadow: name=hugetlbfs
Feb 23 10:32:54 compute-0 groupadd[39449]: new group: name=hugetlbfs, GID=42477
Feb 23 10:32:54 compute-0 sudo[39445]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:54 compute-0 sudo[39604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cupurxdkrxcraowdpighlndnvxdbagxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842774.7302318-630-185302128218972/AnsiballZ_file.py'
Feb 23 10:32:54 compute-0 sudo[39604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:55 compute-0 python3.9[39607]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 23 10:32:55 compute-0 sudo[39604]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:55 compute-0 sudo[39757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhfvyahteujmiwkppdzwqqjkriufglvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842775.6438563-652-45310409524155/AnsiballZ_dnf.py'
Feb 23 10:32:55 compute-0 sudo[39757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:56 compute-0 python3.9[39760]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:32:57 compute-0 sudo[39757]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:58 compute-0 sudo[39911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coabhubqqbwpopkkrrhldjdnrisnovnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842777.797943-668-43112723775848/AnsiballZ_file.py'
Feb 23 10:32:58 compute-0 sudo[39911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:58 compute-0 python3.9[39914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:32:58 compute-0 sudo[39911]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:58 compute-0 sudo[40064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlbdylndrzznfrzudkpeyuqvjaqrahou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842778.4355688-684-188500616590491/AnsiballZ_stat.py'
Feb 23 10:32:58 compute-0 sudo[40064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:58 compute-0 python3.9[40067]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:32:58 compute-0 sudo[40064]: pam_unix(sudo:session): session closed for user root
Feb 23 10:32:59 compute-0 sudo[40188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-popgfvnxqbsmvgdkgritbhhuiqscbyfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842778.4355688-684-188500616590491/AnsiballZ_copy.py'
Feb 23 10:32:59 compute-0 sudo[40188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:32:59 compute-0 python3.9[40191]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771842778.4355688-684-188500616590491/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:32:59 compute-0 sudo[40188]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:00 compute-0 sudo[40341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvwvokfwvqyrjywgtayovwvvkxckrjik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842779.5039167-714-148636918069458/AnsiballZ_systemd.py'
Feb 23 10:33:00 compute-0 sudo[40341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:00 compute-0 python3.9[40344]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:33:00 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 23 10:33:00 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 23 10:33:00 compute-0 kernel: Bridge firewalling registered
Feb 23 10:33:00 compute-0 systemd-modules-load[40348]: Inserted module 'br_netfilter'
Feb 23 10:33:00 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 23 10:33:00 compute-0 sudo[40341]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:01 compute-0 sudo[40501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjctatwcmmiuxgunzidlhjwrtwwhmhlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842780.8181882-730-219901847666834/AnsiballZ_stat.py'
Feb 23 10:33:01 compute-0 sudo[40501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:01 compute-0 python3.9[40504]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:33:01 compute-0 sudo[40501]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:01 compute-0 sudo[40625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpegejviutmbbodcqsktvjsmczctmbgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842780.8181882-730-219901847666834/AnsiballZ_copy.py'
Feb 23 10:33:01 compute-0 sudo[40625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:01 compute-0 python3.9[40628]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771842780.8181882-730-219901847666834/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:33:01 compute-0 sudo[40625]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:02 compute-0 sudo[40778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilfjfwdymqdyfepwkvukfujpbxeudgdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842782.1492662-766-9236209879799/AnsiballZ_dnf.py'
Feb 23 10:33:02 compute-0 sudo[40778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:02 compute-0 python3.9[40781]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:33:05 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 23 10:33:05 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 23 10:33:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:33:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:33:05 compute-0 systemd[1]: Reloading.
Feb 23 10:33:06 compute-0 systemd-rc-local-generator[40839]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:33:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:33:06 compute-0 sudo[40778]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:07 compute-0 python3.9[42753]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:33:08 compute-0 python3.9[44248]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 23 10:33:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:33:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:33:08 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.013s CPU time.
Feb 23 10:33:08 compute-0 systemd[1]: run-r5526096ca1d3496589d4e2a10a74f71a.service: Deactivated successfully.
Feb 23 10:33:08 compute-0 python3.9[44880]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:33:09 compute-0 sudo[45030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybwtqjzdxtxhbicmaohpagywanpvyata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842789.1248424-844-244365579897231/AnsiballZ_command.py'
Feb 23 10:33:09 compute-0 sudo[45030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:09 compute-0 python3.9[45033]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:09 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 10:33:09 compute-0 systemd[1]: Starting Authorization Manager...
Feb 23 10:33:09 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 10:33:09 compute-0 polkitd[45250]: Started polkitd version 0.117
Feb 23 10:33:09 compute-0 polkitd[45250]: Loading rules from directory /etc/polkit-1/rules.d
Feb 23 10:33:09 compute-0 polkitd[45250]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 23 10:33:09 compute-0 polkitd[45250]: Finished loading, compiling and executing 2 rules
Feb 23 10:33:09 compute-0 polkitd[45250]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 23 10:33:09 compute-0 systemd[1]: Started Authorization Manager.
Feb 23 10:33:09 compute-0 sudo[45030]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:09 compute-0 sshd-session[45253]: Connection closed by authenticating user root 143.198.30.3 port 50524 [preauth]
Feb 23 10:33:10 compute-0 sudo[45420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vshkfxsptvezzgzmepglyjwzwctruuzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842790.5202844-862-253434075918283/AnsiballZ_systemd.py'
Feb 23 10:33:10 compute-0 sudo[45420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:11 compute-0 python3.9[45423]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:33:11 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 23 10:33:11 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 23 10:33:11 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 23 10:33:11 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 10:33:11 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 10:33:11 compute-0 sudo[45420]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:12 compute-0 python3.9[45584]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 23 10:33:14 compute-0 sudo[45734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjnoriovdqtvqxspuokkdirftvigdnsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842794.6532075-976-276109057066821/AnsiballZ_systemd.py'
Feb 23 10:33:14 compute-0 sudo[45734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:15 compute-0 python3.9[45737]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:33:15 compute-0 systemd[1]: Reloading.
Feb 23 10:33:15 compute-0 systemd-rc-local-generator[45762]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:33:15 compute-0 sudo[45734]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:15 compute-0 sudo[45931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyloqpcmkkeqllzksniqxuywqfeplwon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842795.430904-976-72340884226276/AnsiballZ_systemd.py'
Feb 23 10:33:15 compute-0 sudo[45931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:15 compute-0 python3.9[45934]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:33:15 compute-0 systemd[1]: Reloading.
Feb 23 10:33:16 compute-0 systemd-rc-local-generator[45961]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:33:16 compute-0 sudo[45931]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:16 compute-0 sudo[46128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfjfkfgilgdkfrldkafuzlpnugtszzkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842796.5838208-1008-118419319720707/AnsiballZ_command.py'
Feb 23 10:33:16 compute-0 sudo[46128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:17 compute-0 python3.9[46131]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:17 compute-0 sudo[46128]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:17 compute-0 sudo[46282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzkmszejskhdbarscyswdgtkrmdmsqxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842797.2855105-1024-11774124394319/AnsiballZ_command.py'
Feb 23 10:33:17 compute-0 sudo[46282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:17 compute-0 python3.9[46285]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:17 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k
Feb 23 10:33:17 compute-0 sudo[46282]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:18 compute-0 sudo[46436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjxeceehvpmzgbrvxirhxdkgsjywpnbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842797.9963503-1040-118218063334292/AnsiballZ_command.py'
Feb 23 10:33:18 compute-0 sudo[46436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:18 compute-0 python3.9[46439]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:19 compute-0 sudo[46436]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:20 compute-0 sudo[46599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqbxhqbtvvutygqwudofkcmdqjodaiow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842799.9261534-1056-54400598653239/AnsiballZ_command.py'
Feb 23 10:33:20 compute-0 sudo[46599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:20 compute-0 python3.9[46602]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:20 compute-0 sudo[46599]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:20 compute-0 sudo[46753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-youqsvixtiywdcvdxaffplivkwbfwogs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842800.5576518-1072-147345165524938/AnsiballZ_systemd.py'
Feb 23 10:33:20 compute-0 sudo[46753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:21 compute-0 python3.9[46756]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:33:21 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 10:33:21 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 23 10:33:21 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 23 10:33:21 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 23 10:33:21 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 23 10:33:21 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 23 10:33:21 compute-0 sudo[46753]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:21 compute-0 sshd-session[32938]: Connection closed by 192.168.122.30 port 45672
Feb 23 10:33:21 compute-0 sshd-session[32935]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:33:21 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 23 10:33:21 compute-0 systemd[1]: session-10.scope: Consumed 1min 57.211s CPU time.
Feb 23 10:33:21 compute-0 systemd-logind[808]: Session 10 logged out. Waiting for processes to exit.
Feb 23 10:33:21 compute-0 systemd-logind[808]: Removed session 10.
Feb 23 10:33:27 compute-0 sshd-session[46787]: Accepted publickey for zuul from 192.168.122.30 port 57524 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:33:27 compute-0 systemd-logind[808]: New session 11 of user zuul.
Feb 23 10:33:27 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 23 10:33:27 compute-0 sshd-session[46787]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:33:27 compute-0 python3.9[46940]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:33:29 compute-0 python3.9[47094]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:33:29 compute-0 sudo[47248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjjawdzjgfpnjrzvboenaqmpqfmmztzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842809.6197817-75-36583495963736/AnsiballZ_command.py'
Feb 23 10:33:30 compute-0 sudo[47248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:30 compute-0 python3.9[47251]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:30 compute-0 sudo[47248]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:31 compute-0 python3.9[47402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:33:31 compute-0 sudo[47556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrkuqrfgrkicczlpcckzbpttynovssok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842811.3380089-115-178984019161491/AnsiballZ_setup.py'
Feb 23 10:33:31 compute-0 sudo[47556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:31 compute-0 python3.9[47559]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:33:32 compute-0 sudo[47556]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:32 compute-0 sudo[47641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qckcuahguavhlmwmuhqzwswoquaipmuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842811.3380089-115-178984019161491/AnsiballZ_dnf.py'
Feb 23 10:33:32 compute-0 sudo[47641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:32 compute-0 python3.9[47644]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:33:33 compute-0 sudo[47641]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:34 compute-0 sudo[47795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfoticigqymmzkzxqnizxnsvdckixapx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842813.9929008-139-138169227860404/AnsiballZ_setup.py'
Feb 23 10:33:34 compute-0 sudo[47795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:34 compute-0 python3.9[47798]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:33:34 compute-0 sudo[47795]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:35 compute-0 sudo[47967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaeifvucaniiltxzxhfgwdbexjpfkxjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842814.9243815-161-229069736618448/AnsiballZ_file.py'
Feb 23 10:33:35 compute-0 sudo[47967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:35 compute-0 python3.9[47970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:33:35 compute-0 sudo[47967]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:35 compute-0 sudo[48120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieyifmmwxzlaeavzeusrsaqscsrrwpyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842815.7276452-177-221902941425077/AnsiballZ_command.py'
Feb 23 10:33:35 compute-0 sudo[48120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:36 compute-0 python3.9[48123]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:33:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3509637939-merged.mount: Deactivated successfully.
Feb 23 10:33:36 compute-0 podman[48124]: 2026-02-23 10:33:36.196898956 +0000 UTC m=+0.039432750 system refresh
Feb 23 10:33:36 compute-0 sudo[48120]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:36 compute-0 sudo[48285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkaiijnwnnqtaedjwmnwxqlozygxrhfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842816.4580798-193-43580365653053/AnsiballZ_stat.py'
Feb 23 10:33:36 compute-0 sudo[48285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:37 compute-0 python3.9[48288]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:33:37 compute-0 sudo[48285]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:33:37 compute-0 sudo[48409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmnqzywmdfmnlivlvzebuwxctlqimflq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842816.4580798-193-43580365653053/AnsiballZ_copy.py'
Feb 23 10:33:37 compute-0 sudo[48409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:37 compute-0 python3.9[48412]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771842816.4580798-193-43580365653053/.source.json follow=False _original_basename=podman_network_config.j2 checksum=5d9a65236850491c7d4993d8b3b2683d962871e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:33:37 compute-0 sudo[48409]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:38 compute-0 sudo[48562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdolrulrhhqjhqxupaubxwoaahgdefvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842817.8316944-223-221987307448495/AnsiballZ_stat.py'
Feb 23 10:33:38 compute-0 sudo[48562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:38 compute-0 python3.9[48565]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:33:38 compute-0 sudo[48562]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:38 compute-0 sudo[48686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kocuzrcguuyixgycgmqoqaxpknqbdhzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842817.8316944-223-221987307448495/AnsiballZ_copy.py'
Feb 23 10:33:38 compute-0 sudo[48686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:38 compute-0 python3.9[48689]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771842817.8316944-223-221987307448495/.source.conf follow=False _original_basename=registries.conf.j2 checksum=ac70d66c4b5ab2334ac0357b84986ea734e0f27b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:33:38 compute-0 sudo[48686]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:39 compute-0 sudo[48839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqnhrggfqrycaojdkkmpvciqhkpvntqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842819.1069732-255-156934710154833/AnsiballZ_ini_file.py'
Feb 23 10:33:39 compute-0 sudo[48839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:39 compute-0 python3.9[48842]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:33:39 compute-0 sudo[48839]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:40 compute-0 sudo[48992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-carpgnnfyfoluncoofhsrimnwduqmhdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842819.871086-255-187256785206181/AnsiballZ_ini_file.py'
Feb 23 10:33:40 compute-0 sudo[48992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:40 compute-0 python3.9[48995]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:33:40 compute-0 sudo[48992]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:40 compute-0 sudo[49145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grrbdvmpuqxjmijopaadhayjrbxxkokd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842820.524734-255-230422794924493/AnsiballZ_ini_file.py'
Feb 23 10:33:40 compute-0 sudo[49145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:40 compute-0 python3.9[49148]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:33:41 compute-0 sudo[49145]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:41 compute-0 sudo[49298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tacshxyjvdskmncrongrqqhhaagrwpkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842821.1003852-255-42300725653953/AnsiballZ_ini_file.py'
Feb 23 10:33:41 compute-0 sudo[49298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:41 compute-0 python3.9[49301]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:33:41 compute-0 sudo[49298]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:42 compute-0 python3.9[49451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:33:42 compute-0 sudo[49603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmnqviqpsojhfniavufjtnnoraegbndo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842822.4871924-335-218818056400229/AnsiballZ_dnf.py'
Feb 23 10:33:42 compute-0 sudo[49603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:42 compute-0 python3.9[49606]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:43 compute-0 sshd-session[49608]: Connection closed by authenticating user root 165.227.79.48 port 35812 [preauth]
Feb 23 10:33:44 compute-0 sudo[49603]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:44 compute-0 sshd-session[49610]: Connection closed by authenticating user root 143.198.30.3 port 55552 [preauth]
Feb 23 10:33:44 compute-0 sudo[49761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaocwkonhghkpkyvmubblqttjqxqshlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842824.3243582-351-22104535016486/AnsiballZ_dnf.py'
Feb 23 10:33:44 compute-0 sudo[49761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:44 compute-0 python3.9[49764]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:46 compute-0 sudo[49761]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:46 compute-0 sudo[49923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kghtodopzvguvswtalcoeztxryruxmtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842826.7811615-371-114232624257788/AnsiballZ_dnf.py'
Feb 23 10:33:46 compute-0 sudo[49923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:47 compute-0 python3.9[49926]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:48 compute-0 sudo[49923]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:48 compute-0 sudo[50078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmeajhdfjdaowfuynwqrvpyytbinbnnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842828.6649175-389-270361972116609/AnsiballZ_dnf.py'
Feb 23 10:33:48 compute-0 sudo[50078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:49 compute-0 python3.9[50081]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:50 compute-0 sudo[50078]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:50 compute-0 sudo[50232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-injjhxhoymrufsaphcjynkwucollhcbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842830.6834886-411-181997025566237/AnsiballZ_dnf.py'
Feb 23 10:33:50 compute-0 sudo[50232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:51 compute-0 python3.9[50235]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:52 compute-0 sudo[50232]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:52 compute-0 sudo[50389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zunklmmntwwolaxpbgaxnprccmnpcpeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842832.7262096-427-213528712876801/AnsiballZ_dnf.py'
Feb 23 10:33:52 compute-0 sudo[50389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:53 compute-0 python3.9[50392]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:55 compute-0 sudo[50389]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:55 compute-0 sudo[50559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkrjvpqrqvpyxezteyyqecglvhskhevv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842835.5356536-445-75371714416565/AnsiballZ_dnf.py'
Feb 23 10:33:55 compute-0 sudo[50559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:55 compute-0 python3.9[50562]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:33:57 compute-0 sudo[50559]: pam_unix(sudo:session): session closed for user root
Feb 23 10:33:57 compute-0 sudo[50713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkcmkxqycotugzbhxmmjzmjkoejuyvja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842837.4328046-463-220127066100670/AnsiballZ_dnf.py'
Feb 23 10:33:57 compute-0 sudo[50713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:33:57 compute-0 python3.9[50716]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:34:08 compute-0 sudo[50713]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:09 compute-0 sudo[51049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjkpolvqiqpmwjzbupjqadkrmllxcpev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842849.1977441-481-4354983484176/AnsiballZ_dnf.py'
Feb 23 10:34:09 compute-0 sudo[51049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:09 compute-0 python3.9[51052]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:34:10 compute-0 sudo[51049]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:11 compute-0 sudo[51206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nusupdklytacwhkisghpxhavfqravyrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842851.210986-501-25712606016517/AnsiballZ_dnf.py'
Feb 23 10:34:11 compute-0 sudo[51206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:11 compute-0 python3.9[51209]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:34:13 compute-0 sudo[51206]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:14 compute-0 sudo[51364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shgqioqkblkowqgqhtxelnzkkdvqwxyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842853.9003742-523-180696339333113/AnsiballZ_file.py'
Feb 23 10:34:14 compute-0 sudo[51364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:14 compute-0 python3.9[51367]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:34:14 compute-0 sudo[51364]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:14 compute-0 sudo[51540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llsviutqtbhqmgoxzbvdmrjbncubvoyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842854.5589187-539-36834020640465/AnsiballZ_stat.py'
Feb 23 10:34:14 compute-0 sudo[51540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:14 compute-0 python3.9[51543]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:34:15 compute-0 sudo[51540]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:15 compute-0 sudo[51664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caumykkozycyfmedtdzoqsbmztdqtvbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842854.5589187-539-36834020640465/AnsiballZ_copy.py'
Feb 23 10:34:15 compute-0 sudo[51664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:15 compute-0 python3.9[51667]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771842854.5589187-539-36834020640465/.source.json _original_basename=.uocdtqst follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:34:15 compute-0 sudo[51664]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:16 compute-0 sudo[51817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onkigzszuybibsmbegoibgsjdflfttnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842855.978945-575-251391295310589/AnsiballZ_podman_image.py'
Feb 23 10:34:16 compute-0 sudo[51817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:16 compute-0 python3.9[51820]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 10:34:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1338072193-lower\x2dmapped.mount: Deactivated successfully.
Feb 23 10:34:19 compute-0 sshd-session[51882]: Connection closed by authenticating user root 143.198.30.3 port 54896 [preauth]
Feb 23 10:34:20 compute-0 podman[51832]: 2026-02-23 10:34:20.979452091 +0000 UTC m=+4.340580446 image pull bfb93be9d83c3121be0312d4d8c02944841d931c726f68b412221913286262d4 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 10:34:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:22 compute-0 sudo[51817]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:22 compute-0 sudo[52128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstdlxbubgavzstmliqeiwzfsxpuruwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842862.6332297-597-57904247594657/AnsiballZ_podman_image.py'
Feb 23 10:34:22 compute-0 sudo[52128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:23 compute-0 python3.9[52131]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 10:34:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:30 compute-0 podman[52144]: 2026-02-23 10:34:30.532380805 +0000 UTC m=+7.414128771 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 10:34:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:30 compute-0 sudo[52128]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:32 compute-0 sudo[52440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cycrmzsxtkyhymimuvdfjgfarmlkvngl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842871.884766-617-126234938035199/AnsiballZ_podman_image.py'
Feb 23 10:34:32 compute-0 sudo[52440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:32 compute-0 python3.9[52443]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 10:34:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:33 compute-0 sshd-session[52479]: Connection closed by authenticating user root 165.227.79.48 port 38348 [preauth]
Feb 23 10:34:40 compute-0 podman[52456]: 2026-02-23 10:34:40.045404827 +0000 UTC m=+7.584333352 image pull 72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 10:34:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:40 compute-0 sudo[52440]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:41 compute-0 sudo[52716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tatdrfbeoyrywioitbvobodqcrwysbtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842881.429841-639-107951984971570/AnsiballZ_podman_image.py'
Feb 23 10:34:41 compute-0 sudo[52716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:41 compute-0 python3.9[52719]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 10:34:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:45 compute-0 podman[52732]: 2026-02-23 10:34:45.140274633 +0000 UTC m=+3.217247699 image pull 06e96a8544ce5b1764a2938311eff3a4e150b0db6e81ca441c51cfb1ef2d06f7 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 23 10:34:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:45 compute-0 sudo[52716]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:45 compute-0 sudo[52985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blgmfmfmyujgihkzznatpyqmnbfrtxyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842885.389933-639-51323531188526/AnsiballZ_podman_image.py'
Feb 23 10:34:45 compute-0 sudo[52985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:45 compute-0 python3.9[52988]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 10:34:46 compute-0 podman[53000]: 2026-02-23 10:34:46.878741741 +0000 UTC m=+1.034649237 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 23 10:34:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:47 compute-0 sudo[52985]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:34:47 compute-0 sshd-session[46790]: Connection closed by 192.168.122.30 port 57524
Feb 23 10:34:47 compute-0 sshd-session[46787]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:34:47 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 23 10:34:47 compute-0 systemd[1]: session-11.scope: Consumed 1min 27.092s CPU time.
Feb 23 10:34:47 compute-0 systemd-logind[808]: Session 11 logged out. Waiting for processes to exit.
Feb 23 10:34:47 compute-0 systemd-logind[808]: Removed session 11.
Feb 23 10:34:48 compute-0 sshd-session[52802]: Connection reset by 205.210.31.172 port 60936 [preauth]
Feb 23 10:34:53 compute-0 sshd-session[53150]: Accepted publickey for zuul from 192.168.122.30 port 35212 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:34:53 compute-0 systemd-logind[808]: New session 12 of user zuul.
Feb 23 10:34:53 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 23 10:34:53 compute-0 sshd-session[53150]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:34:54 compute-0 sshd-session[53304]: Connection closed by authenticating user root 143.198.30.3 port 40018 [preauth]
Feb 23 10:34:54 compute-0 python3.9[53303]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:34:55 compute-0 sudo[53459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzqrvsadyyxsbljdqlpoaknlbatcdqpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842894.7196374-49-186071240210193/AnsiballZ_getent.py'
Feb 23 10:34:55 compute-0 sudo[53459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:55 compute-0 python3.9[53462]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 23 10:34:55 compute-0 sudo[53459]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:55 compute-0 sudo[53613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txznuqhnehbzzzdqrosfnfwvfnkawodv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842895.5340385-65-26892613433225/AnsiballZ_group.py'
Feb 23 10:34:55 compute-0 sudo[53613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:56 compute-0 python3.9[53616]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 10:34:56 compute-0 groupadd[53617]: group added to /etc/group: name=openvswitch, GID=42476
Feb 23 10:34:56 compute-0 groupadd[53617]: group added to /etc/gshadow: name=openvswitch
Feb 23 10:34:56 compute-0 groupadd[53617]: new group: name=openvswitch, GID=42476
Feb 23 10:34:56 compute-0 sudo[53613]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:56 compute-0 sudo[53772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tboquqijjwjdgwaindzlqwytqlvtuavy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842896.3438764-81-129938148812391/AnsiballZ_user.py'
Feb 23 10:34:56 compute-0 sudo[53772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:56 compute-0 python3.9[53775]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 10:34:56 compute-0 useradd[53777]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/1
Feb 23 10:34:56 compute-0 useradd[53777]: add 'openvswitch' to group 'hugetlbfs'
Feb 23 10:34:56 compute-0 useradd[53777]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 23 10:34:57 compute-0 sudo[53772]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:57 compute-0 sudo[53933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzagfjmrcywlaqbfekzekvmdhgdidsat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842897.5751178-101-176857757768929/AnsiballZ_setup.py'
Feb 23 10:34:57 compute-0 sudo[53933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:58 compute-0 python3.9[53936]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:34:58 compute-0 sudo[53933]: pam_unix(sudo:session): session closed for user root
Feb 23 10:34:58 compute-0 sudo[54018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykvmzzvhnggpmwtzhoceazzxndlqxiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842897.5751178-101-176857757768929/AnsiballZ_dnf.py'
Feb 23 10:34:58 compute-0 sudo[54018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:34:59 compute-0 python3.9[54021]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:35:00 compute-0 sudo[54018]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:02 compute-0 sudo[54181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttewafbgornscrdblqdyflygoddljar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842901.8176646-129-49625610045188/AnsiballZ_dnf.py'
Feb 23 10:35:02 compute-0 sudo[54181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:02 compute-0 python3.9[54184]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:35:15 compute-0 kernel: SELinux:  Converting 2740 SID table entries...
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:35:15 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:35:15 compute-0 groupadd[54211]: group added to /etc/group: name=unbound, GID=994
Feb 23 10:35:15 compute-0 groupadd[54211]: group added to /etc/gshadow: name=unbound
Feb 23 10:35:15 compute-0 groupadd[54211]: new group: name=unbound, GID=994
Feb 23 10:35:16 compute-0 useradd[54218]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 23 10:35:16 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 23 10:35:16 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 23 10:35:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:35:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:35:17 compute-0 systemd[1]: Reloading.
Feb 23 10:35:17 compute-0 systemd-rc-local-generator[54709]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:35:17 compute-0 systemd-sysv-generator[54717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:35:17 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:35:17 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:35:17 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:35:17 compute-0 systemd[1]: run-r4f48295a8207488faab65c2f91a7ec0a.service: Deactivated successfully.
Feb 23 10:35:17 compute-0 sudo[54181]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:18 compute-0 sudo[55309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvonucfisnmdsmkhifwescdlxjhyywfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842917.852247-145-19825571102066/AnsiballZ_systemd.py'
Feb 23 10:35:18 compute-0 sudo[55309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:18 compute-0 python3.9[55312]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:35:18 compute-0 systemd[1]: Reloading.
Feb 23 10:35:18 compute-0 systemd-rc-local-generator[55334]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:35:18 compute-0 systemd-sysv-generator[55343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:35:18 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 23 10:35:18 compute-0 chown[55360]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 23 10:35:18 compute-0 ovs-ctl[55365]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 23 10:35:18 compute-0 ovs-ctl[55365]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 23 10:35:19 compute-0 ovs-ctl[55365]: Starting ovsdb-server [  OK  ]
Feb 23 10:35:19 compute-0 ovs-vsctl[55414]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 23 10:35:19 compute-0 ovs-vsctl[55434]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"260ff7a6-2911-481e-914f-54dc92f9c3bf\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 23 10:35:19 compute-0 ovs-ctl[55365]: Configuring Open vSwitch system IDs [  OK  ]
Feb 23 10:35:19 compute-0 ovs-ctl[55365]: Enabling remote OVSDB managers [  OK  ]
Feb 23 10:35:19 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 23 10:35:19 compute-0 ovs-vsctl[55440]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 23 10:35:19 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 23 10:35:19 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 23 10:35:19 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 23 10:35:19 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 23 10:35:19 compute-0 ovs-ctl[55484]: Inserting openvswitch module [  OK  ]
Feb 23 10:35:19 compute-0 ovs-ctl[55453]: Starting ovs-vswitchd [  OK  ]
Feb 23 10:35:19 compute-0 ovs-vsctl[55502]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 23 10:35:19 compute-0 ovs-ctl[55453]: Enabling remote OVSDB managers [  OK  ]
Feb 23 10:35:19 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 23 10:35:19 compute-0 systemd[1]: Starting Open vSwitch...
Feb 23 10:35:19 compute-0 systemd[1]: Finished Open vSwitch.
Feb 23 10:35:19 compute-0 sudo[55309]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:20 compute-0 python3.9[55653]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:35:21 compute-0 sudo[55803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-claoeqkxyfpienfaaedogldfkgotfhow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842921.0371833-183-214075261328782/AnsiballZ_sefcontext.py'
Feb 23 10:35:21 compute-0 sudo[55803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:21 compute-0 python3.9[55806]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 23 10:35:22 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:35:22 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:35:22 compute-0 sudo[55803]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:23 compute-0 python3.9[55961]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:35:24 compute-0 sudo[56117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzvplqauspqgqerecukpbxlrezbfpjzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842924.1319358-219-262259582710260/AnsiballZ_dnf.py'
Feb 23 10:35:24 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 23 10:35:24 compute-0 sudo[56117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:24 compute-0 python3.9[56120]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:35:24 compute-0 sshd-session[56121]: Connection closed by authenticating user root 165.227.79.48 port 57586 [preauth]
Feb 23 10:35:25 compute-0 sudo[56117]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:26 compute-0 sudo[56273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhttkmnlicszaeitkmclttecjaeyopcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842925.975278-235-124132327076836/AnsiballZ_command.py'
Feb 23 10:35:26 compute-0 sudo[56273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:26 compute-0 python3.9[56276]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:35:27 compute-0 sudo[56273]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:27 compute-0 sudo[56561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yipwmlcwfyolluymwearcqrfrsgxqcyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842927.4787865-251-171813103825170/AnsiballZ_file.py'
Feb 23 10:35:27 compute-0 sudo[56561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:28 compute-0 python3.9[56564]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 23 10:35:28 compute-0 sudo[56561]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:28 compute-0 sshd-session[56688]: Connection closed by authenticating user root 143.198.30.3 port 56140 [preauth]
Feb 23 10:35:28 compute-0 python3.9[56716]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:35:29 compute-0 sudo[56868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-affyabmleiyvwjltlswxcboutlxzhrmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842929.2039657-283-115860055882129/AnsiballZ_dnf.py'
Feb 23 10:35:29 compute-0 sudo[56868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:29 compute-0 python3.9[56871]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:35:31 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:35:31 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:35:31 compute-0 systemd[1]: Reloading.
Feb 23 10:35:31 compute-0 systemd-rc-local-generator[56911]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:35:31 compute-0 systemd-sysv-generator[56914]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:35:31 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:35:32 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:35:32 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:35:32 compute-0 systemd[1]: run-rdd7c46b0be874bca8dc9e8e1655411a4.service: Deactivated successfully.
Feb 23 10:35:32 compute-0 sudo[56868]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:32 compute-0 sudo[57192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sswufkmrzjrfkelipjkastyujdlsmbsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842932.332664-299-51782545555341/AnsiballZ_systemd.py'
Feb 23 10:35:32 compute-0 sudo[57192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:32 compute-0 python3.9[57195]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:35:32 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 23 10:35:32 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 23 10:35:32 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 23 10:35:32 compute-0 NetworkManager[7689]: <info>  [1771842932.9344] caught SIGTERM, shutting down normally.
Feb 23 10:35:32 compute-0 systemd[1]: Stopping Network Manager...
Feb 23 10:35:32 compute-0 NetworkManager[7689]: <info>  [1771842932.9352] dhcp4 (eth0): canceled DHCP transaction
Feb 23 10:35:32 compute-0 NetworkManager[7689]: <info>  [1771842932.9353] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:35:32 compute-0 NetworkManager[7689]: <info>  [1771842932.9353] dhcp4 (eth0): state changed no lease
Feb 23 10:35:32 compute-0 NetworkManager[7689]: <info>  [1771842932.9354] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 10:35:32 compute-0 NetworkManager[7689]: <info>  [1771842932.9434] exiting (success)
Feb 23 10:35:32 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 10:35:32 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 10:35:32 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 23 10:35:32 compute-0 systemd[1]: Stopped Network Manager.
Feb 23 10:35:32 compute-0 systemd[1]: NetworkManager.service: Consumed 10.128s CPU time, 4.1M memory peak, read 0B from disk, written 27.5K to disk.
Feb 23 10:35:32 compute-0 systemd[1]: Starting Network Manager...
Feb 23 10:35:32 compute-0 NetworkManager[57207]: <info>  [1771842932.9828] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:9e288ccf-ad11-4627-abc5-9df48b7c9713)
Feb 23 10:35:32 compute-0 NetworkManager[57207]: <info>  [1771842932.9829] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 23 10:35:32 compute-0 NetworkManager[57207]: <info>  [1771842932.9864] manager[0x562a8316d000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 10:35:32 compute-0 systemd[1]: Starting Hostname Service...
Feb 23 10:35:33 compute-0 systemd[1]: Started Hostname Service.
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0659] hostname: hostname: using hostnamed
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0661] hostname: static hostname changed from (none) to "compute-0"
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0665] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0668] manager[0x562a8316d000]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0669] manager[0x562a8316d000]: rfkill: WWAN hardware radio set enabled
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0685] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0692] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0693] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0693] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0694] manager: Networking is enabled by state file
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0696] settings: Loaded settings plugin: keyfile (internal)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0698] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0722] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0729] dhcp: init: Using DHCP client 'internal'
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0731] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0735] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0740] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0746] device (lo): Activation: starting connection 'lo' (c3e17fe3-3502-4aa2-b43f-fdb973524017)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0751] device (eth0): carrier: link connected
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0754] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0758] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0758] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0763] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0769] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0773] device (eth1): carrier: link connected
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0776] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0780] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (2157fe3c-5bc1-52fd-87e4-427c4a0eb10c) (indicated)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0780] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0783] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0789] device (eth1): Activation: starting connection 'ci-private-network' (2157fe3c-5bc1-52fd-87e4-427c4a0eb10c)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0793] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 10:35:33 compute-0 systemd[1]: Started Network Manager.
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0800] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0805] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0807] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0810] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0813] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0816] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0820] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0824] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0831] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0835] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0841] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0853] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0863] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0864] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0869] device (lo): Activation: successful, device activated.
Feb 23 10:35:33 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0902] dhcp4 (eth0): state changed new lease, address=38.102.83.199
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0910] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0979] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0983] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0988] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0990] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.0993] device (eth1): Activation: successful, device activated.
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.1023] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.1024] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.1028] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.1030] device (eth0): Activation: successful, device activated.
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.1034] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 10:35:33 compute-0 NetworkManager[57207]: <info>  [1771842933.1072] manager: startup complete
Feb 23 10:35:33 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 23 10:35:33 compute-0 sudo[57192]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:33 compute-0 sudo[57420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyjpgczphufrxkfbhvpsaotdnfpmtnhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842933.3581069-315-269725444922876/AnsiballZ_dnf.py'
Feb 23 10:35:33 compute-0 sudo[57420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:33 compute-0 python3.9[57423]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:35:38 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:35:38 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:35:38 compute-0 systemd[1]: Reloading.
Feb 23 10:35:38 compute-0 systemd-sysv-generator[57475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:35:38 compute-0 systemd-rc-local-generator[57472]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:35:38 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:35:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:35:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:35:38 compute-0 systemd[1]: run-r5134a053b32049aca994cb4cf11a1419.service: Deactivated successfully.
Feb 23 10:35:39 compute-0 sudo[57420]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:39 compute-0 sudo[57899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuuovkgqyotnlgwxwrdrkxgqwdgciudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842939.60699-339-89281706658395/AnsiballZ_stat.py'
Feb 23 10:35:39 compute-0 sudo[57899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:40 compute-0 python3.9[57902]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:35:40 compute-0 sudo[57899]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:40 compute-0 sudo[58052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjdektfkhnnbkgtxodkaatyiphtuhekd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842940.3607843-357-194391932225220/AnsiballZ_ini_file.py'
Feb 23 10:35:40 compute-0 sudo[58052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:40 compute-0 python3.9[58055]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:40 compute-0 sudo[58052]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:41 compute-0 sudo[58207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoymiovmkctuvgxqgwvojgltmrhipcxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842941.4092627-377-162867313382017/AnsiballZ_ini_file.py'
Feb 23 10:35:41 compute-0 sudo[58207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:41 compute-0 python3.9[58210]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:41 compute-0 sudo[58207]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:42 compute-0 sudo[58360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzczcuubpctitvqgkevmhnthcukvxdbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842942.0083065-377-205441988469516/AnsiballZ_ini_file.py'
Feb 23 10:35:42 compute-0 sudo[58360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:42 compute-0 python3.9[58363]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:42 compute-0 sudo[58360]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:43 compute-0 sudo[58513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igprgkwyvgzzahayonedmfmrtxfliwzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842942.7982001-407-95566059676899/AnsiballZ_ini_file.py'
Feb 23 10:35:43 compute-0 sudo[58513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:43 compute-0 python3.9[58516]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:43 compute-0 sudo[58513]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:43 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 10:35:43 compute-0 sudo[58666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shuqwfomuuyapgrlsmnyxliiexhouiac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842943.2851443-407-76239522734651/AnsiballZ_ini_file.py'
Feb 23 10:35:43 compute-0 sudo[58666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:43 compute-0 python3.9[58669]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:43 compute-0 sudo[58666]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:44 compute-0 sudo[58819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ompqgbvuqkkptrjlbqtflgqchrixfdba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842944.19829-437-38997396750652/AnsiballZ_stat.py'
Feb 23 10:35:44 compute-0 sudo[58819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:44 compute-0 python3.9[58822]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:35:44 compute-0 sudo[58819]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:45 compute-0 sudo[58943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqjblreqokhrxqhmxlcjdqrpflecjdva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842944.19829-437-38997396750652/AnsiballZ_copy.py'
Feb 23 10:35:45 compute-0 sudo[58943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:45 compute-0 python3.9[58946]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771842944.19829-437-38997396750652/.source _original_basename=.p0imuc_m follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:45 compute-0 sudo[58943]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:45 compute-0 sudo[59096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmvyzkqkqzabmwtdepvoikwtqzxpqbiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842945.4373865-467-157080994570997/AnsiballZ_file.py'
Feb 23 10:35:45 compute-0 sudo[59096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:45 compute-0 python3.9[59099]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:45 compute-0 sudo[59096]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:46 compute-0 sudo[59249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fltfyfoehitvgtkzyadfznfmnajrtwuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842946.0633981-483-82186134671623/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 23 10:35:46 compute-0 sudo[59249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:46 compute-0 python3.9[59252]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 23 10:35:46 compute-0 sudo[59249]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:47 compute-0 sudo[59402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqfiedjlovvtpgitehcgeqlmhistzxeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842946.988272-501-11252715813591/AnsiballZ_file.py'
Feb 23 10:35:47 compute-0 sudo[59402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:47 compute-0 python3.9[59405]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:47 compute-0 sudo[59402]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:48 compute-0 sudo[59555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cumqicfzedbdpreagprsidladiekloag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842947.8141983-521-190743221829313/AnsiballZ_stat.py'
Feb 23 10:35:48 compute-0 sudo[59555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:48 compute-0 sudo[59555]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:48 compute-0 sudo[59679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogzpymlpbazhrejkecpsebtzovbtresn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842947.8141983-521-190743221829313/AnsiballZ_copy.py'
Feb 23 10:35:48 compute-0 sudo[59679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:48 compute-0 sudo[59679]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:49 compute-0 sudo[59832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zytvajyaemekyztjdhgrwqdyiyvcbgkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842948.9540863-551-181181920422950/AnsiballZ_slurp.py'
Feb 23 10:35:49 compute-0 sudo[59832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:49 compute-0 python3.9[59835]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 23 10:35:49 compute-0 sudo[59832]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:50 compute-0 sudo[60008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqmavktyinmtrtnkolzwdzofwohujede ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842949.8046458-569-218613045148042/async_wrapper.py j437036268357 300 /home/zuul/.ansible/tmp/ansible-tmp-1771842949.8046458-569-218613045148042/AnsiballZ_edpm_os_net_config.py _'
Feb 23 10:35:50 compute-0 sudo[60008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:50 compute-0 ansible-async_wrapper.py[60011]: Invoked with j437036268357 300 /home/zuul/.ansible/tmp/ansible-tmp-1771842949.8046458-569-218613045148042/AnsiballZ_edpm_os_net_config.py _
Feb 23 10:35:50 compute-0 ansible-async_wrapper.py[60014]: Starting module and watcher
Feb 23 10:35:50 compute-0 ansible-async_wrapper.py[60014]: Start watching 60015 (300)
Feb 23 10:35:50 compute-0 ansible-async_wrapper.py[60015]: Start module (60015)
Feb 23 10:35:50 compute-0 ansible-async_wrapper.py[60011]: Return async_wrapper task started.
Feb 23 10:35:50 compute-0 sudo[60008]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:50 compute-0 python3.9[60016]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 23 10:35:51 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 23 10:35:51 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 23 10:35:51 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 23 10:35:51 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 23 10:35:51 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2315] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2335] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2747] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2749] audit: op="connection-add" uuid="989c3e5f-d008-4f90-959e-1de0d86fea8f" name="br-ex-br" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2764] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2767] audit: op="connection-add" uuid="27bde5d7-7172-4452-a62d-3d6526d7d11e" name="br-ex-port" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2780] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2782] audit: op="connection-add" uuid="d1ad0ca5-d8dc-4d29-8a59-7181fddb0f2e" name="eth1-port" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2793] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2795] audit: op="connection-add" uuid="fb9b64ef-7c60-44fc-a31f-0d50c04228c8" name="vlan20-port" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2806] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2808] audit: op="connection-add" uuid="2333d936-bf92-4987-bd9a-073c9836780c" name="vlan21-port" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2820] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2822] audit: op="connection-add" uuid="a1a958a9-8a29-4a6e-8799-b9fab4991988" name="vlan22-port" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2841] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2859] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2861] audit: op="connection-add" uuid="479290b7-cfeb-4c7a-988f-a6703bb1f76a" name="br-ex-if" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2913] audit: op="connection-update" uuid="2157fe3c-5bc1-52fd-87e4-427c4a0eb10c" name="ci-private-network" args="ipv6.method,ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routes,ovs-external-ids.data,connection.port-type,connection.controller,connection.master,connection.timestamp,connection.slave-type,ovs-interface.type,ipv4.method,ipv4.routing-rules,ipv4.dns,ipv4.addresses,ipv4.never-default,ipv4.routes" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2930] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2932] audit: op="connection-add" uuid="968d6610-c535-4654-a551-cfe4cb570ea9" name="vlan20-if" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2950] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2952] audit: op="connection-add" uuid="e59f7e29-7d8d-473d-9ad9-dbe44a93795d" name="vlan21-if" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2968] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2971] audit: op="connection-add" uuid="0146d7a6-381c-4c4b-9fbc-4bc7d7e0f1e1" name="vlan22-if" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2980] audit: op="connection-delete" uuid="a3dcef92-ee93-3e7f-aa26-1790b6720768" name="Wired connection 1" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.2990] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.2993] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3000] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3005] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (989c3e5f-d008-4f90-959e-1de0d86fea8f)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3006] audit: op="connection-activate" uuid="989c3e5f-d008-4f90-959e-1de0d86fea8f" name="br-ex-br" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3009] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3011] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3016] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3020] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (27bde5d7-7172-4452-a62d-3d6526d7d11e)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3023] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3024] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3029] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3033] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d1ad0ca5-d8dc-4d29-8a59-7181fddb0f2e)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3035] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3037] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3042] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3047] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (fb9b64ef-7c60-44fc-a31f-0d50c04228c8)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3049] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3051] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3056] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3060] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (2333d936-bf92-4987-bd9a-073c9836780c)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3062] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3064] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3069] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3074] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (a1a958a9-8a29-4a6e-8799-b9fab4991988)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3075] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3078] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3080] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3087] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3088] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3092] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3097] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (479290b7-cfeb-4c7a-988f-a6703bb1f76a)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3098] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3101] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3103] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3105] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3107] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3119] device (eth1): disconnecting for new activation request.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3120] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3124] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3127] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3130] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3134] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3136] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3142] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3148] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (968d6610-c535-4654-a551-cfe4cb570ea9)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3150] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3155] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3158] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3161] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3165] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3168] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3173] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3179] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (e59f7e29-7d8d-473d-9ad9-dbe44a93795d)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3180] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3184] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3187] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3189] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3192] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <warn>  [1771842952.3193] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3197] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3202] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (0146d7a6-381c-4c4b-9fbc-4bc7d7e0f1e1)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3203] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3206] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3209] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3211] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3213] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3228] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3230] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3234] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3236] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3244] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3248] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3255] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3261] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3263] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3268] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3271] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3274] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 systemd-udevd[60021]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3276] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3280] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3283] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 kernel: Timeout policy base is empty
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3286] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3288] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3292] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3296] dhcp4 (eth0): canceled DHCP transaction
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3296] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3296] dhcp4 (eth0): state changed no lease
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3298] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 23 10:35:52 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3306] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3309] audit: op="device-reapply" interface="eth1" ifindex=3 pid=60017 uid=0 result="fail" reason="Device is not activated"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3347] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3351] dhcp4 (eth0): state changed new lease, address=38.102.83.199
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3387] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3394] device (eth1): disconnecting for new activation request.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3394] audit: op="connection-activate" uuid="2157fe3c-5bc1-52fd-87e4-427c4a0eb10c" name="ci-private-network" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3395] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 23 10:35:52 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3415] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=60017 uid=0 result="success"
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3450] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 23 10:35:52 compute-0 kernel: br-ex: entered promiscuous mode
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3562] device (eth1): Activation: starting connection 'ci-private-network' (2157fe3c-5bc1-52fd-87e4-427c4a0eb10c)
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3567] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3575] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3578] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3582] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3585] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3592] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3593] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3594] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3595] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3603] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3615] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3621] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3624] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3627] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3629] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3632] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3635] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3639] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3642] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3645] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3648] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3653] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3655] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3659] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3675] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3682] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 kernel: vlan22: entered promiscuous mode
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3690] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3694] device (eth1): Activation: successful, device activated.
Feb 23 10:35:52 compute-0 kernel: vlan20: entered promiscuous mode
Feb 23 10:35:52 compute-0 systemd-udevd[60020]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3757] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3761] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3763] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3767] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3785] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 kernel: vlan21: entered promiscuous mode
Feb 23 10:35:52 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3824] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3832] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3842] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3843] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3847] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3856] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3859] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3861] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3864] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3878] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3914] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3915] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 23 10:35:52 compute-0 NetworkManager[57207]: <info>  [1771842952.3920] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 23 10:35:53 compute-0 NetworkManager[57207]: <info>  [1771842953.4814] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=60017 uid=0 result="success"
Feb 23 10:35:53 compute-0 NetworkManager[57207]: <info>  [1771842953.5818] checkpoint[0x562a83141950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 23 10:35:53 compute-0 NetworkManager[57207]: <info>  [1771842953.5820] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=60017 uid=0 result="success"
Feb 23 10:35:53 compute-0 NetworkManager[57207]: <info>  [1771842953.8104] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=60017 uid=0 result="success"
Feb 23 10:35:53 compute-0 NetworkManager[57207]: <info>  [1771842953.8115] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=60017 uid=0 result="success"
Feb 23 10:35:54 compute-0 NetworkManager[57207]: <info>  [1771842954.1102] audit: op="networking-control" arg="global-dns-configuration" pid=60017 uid=0 result="success"
Feb 23 10:35:54 compute-0 NetworkManager[57207]: <info>  [1771842954.1140] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 23 10:35:54 compute-0 NetworkManager[57207]: <info>  [1771842954.1165] audit: op="networking-control" arg="global-dns-configuration" pid=60017 uid=0 result="success"
Feb 23 10:35:54 compute-0 NetworkManager[57207]: <info>  [1771842954.1179] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=60017 uid=0 result="success"
Feb 23 10:35:54 compute-0 sudo[60353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnswzzceefzdeyaxlwfdblyskjatwdzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842953.745147-569-163958106394232/AnsiballZ_async_status.py'
Feb 23 10:35:54 compute-0 sudo[60353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:54 compute-0 NetworkManager[57207]: <info>  [1771842954.2408] checkpoint[0x562a83141a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 23 10:35:54 compute-0 NetworkManager[57207]: <info>  [1771842954.2412] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=60017 uid=0 result="success"
Feb 23 10:35:54 compute-0 ansible-async_wrapper.py[60015]: Module complete (60015)
Feb 23 10:35:54 compute-0 python3.9[60356]: ansible-ansible.legacy.async_status Invoked with jid=j437036268357.60011 mode=status _async_dir=/root/.ansible_async
Feb 23 10:35:54 compute-0 sudo[60353]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:54 compute-0 sudo[60454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvdaqqycexmcdubylliltanmyzkfboew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842953.745147-569-163958106394232/AnsiballZ_async_status.py'
Feb 23 10:35:54 compute-0 sudo[60454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:54 compute-0 python3.9[60457]: ansible-ansible.legacy.async_status Invoked with jid=j437036268357.60011 mode=cleanup _async_dir=/root/.ansible_async
Feb 23 10:35:54 compute-0 sudo[60454]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:55 compute-0 sudo[60607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lodmxqchikqahdnpsjglyezvrksxnirm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842955.0892031-613-193682389542713/AnsiballZ_stat.py'
Feb 23 10:35:55 compute-0 sudo[60607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:55 compute-0 python3.9[60610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:35:55 compute-0 sudo[60607]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:55 compute-0 ansible-async_wrapper.py[60014]: Done in kid B.
Feb 23 10:35:55 compute-0 sudo[60731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raazgositgwjjjiwpugftmstyqmjbqqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842955.0892031-613-193682389542713/AnsiballZ_copy.py'
Feb 23 10:35:55 compute-0 sudo[60731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:56 compute-0 python3.9[60734]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771842955.0892031-613-193682389542713/.source.returncode _original_basename=.r95mx1qi follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:56 compute-0 sudo[60731]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:56 compute-0 sudo[60884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipdeepwfxmgmvtgpmmgcqeqmwatpkkkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842956.2861366-645-52409886285291/AnsiballZ_stat.py'
Feb 23 10:35:56 compute-0 sudo[60884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:56 compute-0 python3.9[60887]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:35:56 compute-0 sudo[60884]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:57 compute-0 sudo[61008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irmqptkjwhvvfozjdyvqvavocvuayizz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842956.2861366-645-52409886285291/AnsiballZ_copy.py'
Feb 23 10:35:57 compute-0 sudo[61008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:57 compute-0 python3.9[61011]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771842956.2861366-645-52409886285291/.source.cfg _original_basename=.p_hqa5b4 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:35:57 compute-0 sudo[61008]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:57 compute-0 sudo[61161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwwybbbgammuutylttlesycdjvhhvubv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842957.5204263-675-109429454089814/AnsiballZ_systemd.py'
Feb 23 10:35:57 compute-0 sudo[61161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:35:58 compute-0 python3.9[61165]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:35:58 compute-0 systemd[1]: Reloading Network Manager...
Feb 23 10:35:58 compute-0 NetworkManager[57207]: <info>  [1771842958.1143] audit: op="reload" arg="0" pid=61169 uid=0 result="success"
Feb 23 10:35:58 compute-0 NetworkManager[57207]: <info>  [1771842958.1148] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 23 10:35:58 compute-0 systemd[1]: Reloaded Network Manager.
Feb 23 10:35:58 compute-0 sudo[61161]: pam_unix(sudo:session): session closed for user root
Feb 23 10:35:58 compute-0 sshd-session[53153]: Connection closed by 192.168.122.30 port 35212
Feb 23 10:35:58 compute-0 sshd-session[53150]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:35:58 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 23 10:35:58 compute-0 systemd[1]: session-12.scope: Consumed 40.872s CPU time.
Feb 23 10:35:58 compute-0 systemd-logind[808]: Session 12 logged out. Waiting for processes to exit.
Feb 23 10:35:58 compute-0 systemd-logind[808]: Removed session 12.
Feb 23 10:36:02 compute-0 sshd-session[61199]: Connection closed by authenticating user root 143.198.30.3 port 33766 [preauth]
Feb 23 10:36:03 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 10:36:04 compute-0 sshd-session[61203]: Accepted publickey for zuul from 192.168.122.30 port 50330 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:36:04 compute-0 systemd-logind[808]: New session 13 of user zuul.
Feb 23 10:36:04 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 23 10:36:04 compute-0 sshd-session[61203]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:36:05 compute-0 python3.9[61358]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:36:05 compute-0 sshd-session[61204]: Invalid user mysql from 185.156.73.233 port 23930
Feb 23 10:36:06 compute-0 sshd-session[61204]: Connection closed by invalid user mysql 185.156.73.233 port 23930 [preauth]
Feb 23 10:36:06 compute-0 python3.9[61513]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:36:07 compute-0 python3.9[61702]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:36:07 compute-0 sshd-session[61207]: Connection closed by 192.168.122.30 port 50330
Feb 23 10:36:07 compute-0 sshd-session[61203]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:36:07 compute-0 systemd-logind[808]: Session 13 logged out. Waiting for processes to exit.
Feb 23 10:36:07 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 23 10:36:07 compute-0 systemd[1]: session-13.scope: Consumed 1.884s CPU time.
Feb 23 10:36:07 compute-0 systemd-logind[808]: Removed session 13.
Feb 23 10:36:08 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 10:36:13 compute-0 sshd-session[61732]: Accepted publickey for zuul from 192.168.122.30 port 41562 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:36:13 compute-0 systemd-logind[808]: New session 14 of user zuul.
Feb 23 10:36:13 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 23 10:36:13 compute-0 sshd-session[61732]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:36:14 compute-0 python3.9[61886]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:36:14 compute-0 sshd-session[61887]: Connection closed by authenticating user root 165.227.79.48 port 35358 [preauth]
Feb 23 10:36:15 compute-0 python3.9[62042]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:36:15 compute-0 sudo[62196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wimthrpqeffvwenwgnirrxwaxzcyepso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842975.5438147-55-148313151319473/AnsiballZ_setup.py'
Feb 23 10:36:15 compute-0 sudo[62196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:16 compute-0 python3.9[62199]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:36:16 compute-0 sudo[62196]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:16 compute-0 sudo[62281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alufbexhskgkobwxcdsjanvgoskvbsqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842975.5438147-55-148313151319473/AnsiballZ_dnf.py'
Feb 23 10:36:16 compute-0 sudo[62281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:16 compute-0 python3.9[62284]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:36:18 compute-0 sudo[62281]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:18 compute-0 sudo[62436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxucfrkdjhlilfhbqrbbchfphrtnajr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842978.2728722-79-113849233251434/AnsiballZ_setup.py'
Feb 23 10:36:18 compute-0 sudo[62436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:18 compute-0 python3.9[62439]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:36:19 compute-0 sudo[62436]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:19 compute-0 sudo[62628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahpxnamifdbsowbrieysfscxxvknfcsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842979.4082265-101-15506896019094/AnsiballZ_file.py'
Feb 23 10:36:19 compute-0 sudo[62628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:20 compute-0 python3.9[62631]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:36:20 compute-0 sudo[62628]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:20 compute-0 sudo[62782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgnfnjwughwltlsthogzqgowncmovhiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842980.310502-117-179960467482794/AnsiballZ_command.py'
Feb 23 10:36:20 compute-0 sudo[62782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:20 compute-0 python3.9[62785]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:36:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:36:20 compute-0 sudo[62782]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:22 compute-0 sudo[62946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdniteieotcbilzhqdcatqyvbhtaewbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842981.4294045-133-262285886133811/AnsiballZ_stat.py'
Feb 23 10:36:22 compute-0 sudo[62946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:22 compute-0 python3.9[62949]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:36:22 compute-0 sudo[62946]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:22 compute-0 sudo[63025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iijjzrtytyzdtznltscnfxryqbfsepse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842981.4294045-133-262285886133811/AnsiballZ_file.py'
Feb 23 10:36:22 compute-0 sudo[63025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:23 compute-0 python3.9[63028]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:36:23 compute-0 sudo[63025]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:23 compute-0 sudo[63179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkkzqckbsgkrfpspqiadulcbdefadxzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842983.3703103-157-208160368787167/AnsiballZ_stat.py'
Feb 23 10:36:23 compute-0 sudo[63179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:23 compute-0 python3.9[63182]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:36:23 compute-0 sudo[63179]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:24 compute-0 sudo[63258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwsyasnpbulaqoiaclloruliklzpgxai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842983.3703103-157-208160368787167/AnsiballZ_file.py'
Feb 23 10:36:24 compute-0 sudo[63258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:24 compute-0 python3.9[63261]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:36:24 compute-0 sudo[63258]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:25 compute-0 sudo[63411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jswboynmlkylgyjbouinatidwtnehnps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842984.7691534-183-130791012206812/AnsiballZ_ini_file.py'
Feb 23 10:36:25 compute-0 sudo[63411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:25 compute-0 python3.9[63414]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:36:25 compute-0 sudo[63411]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:25 compute-0 sudo[63564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcjpxpjpqgifamhfzrphzkbzsgtpcuno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842985.5160255-183-226337409626024/AnsiballZ_ini_file.py'
Feb 23 10:36:25 compute-0 sudo[63564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:25 compute-0 python3.9[63567]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:36:25 compute-0 sudo[63564]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:26 compute-0 sudo[63717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfvvhyuzxexhmqnnjffrywwqyqredppg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842986.0697758-183-244082911759625/AnsiballZ_ini_file.py'
Feb 23 10:36:26 compute-0 sudo[63717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:26 compute-0 python3.9[63720]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:36:26 compute-0 sudo[63717]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:26 compute-0 sudo[63870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewwjpzbxrcefkojvwpjzjyignljdmzjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842986.5875804-183-48918575674045/AnsiballZ_ini_file.py'
Feb 23 10:36:26 compute-0 sudo[63870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:27 compute-0 python3.9[63873]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:36:27 compute-0 sudo[63870]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:27 compute-0 sudo[64023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpucvohibamhrmihbmhjrxoodcxwaujm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842987.4958858-245-239260691948189/AnsiballZ_dnf.py'
Feb 23 10:36:27 compute-0 sudo[64023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:27 compute-0 python3.9[64026]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:36:29 compute-0 sudo[64023]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:29 compute-0 sudo[64177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgobyukyuscaglwnrbpfpsnhqrufvccv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842989.7040768-267-152169692635030/AnsiballZ_setup.py'
Feb 23 10:36:29 compute-0 sudo[64177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:30 compute-0 python3.9[64180]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:36:30 compute-0 sudo[64177]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:30 compute-0 sudo[64332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjdhrbckdhumbzzrjltqvzwwlyfpcfnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842990.5645072-283-48523331026873/AnsiballZ_stat.py'
Feb 23 10:36:30 compute-0 sudo[64332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:31 compute-0 python3.9[64335]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:36:31 compute-0 sudo[64332]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:31 compute-0 sudo[64485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioacktjoionnaxpeznnqqzmfnzqfzboy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842991.3490217-301-88497158211984/AnsiballZ_stat.py'
Feb 23 10:36:31 compute-0 sudo[64485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:31 compute-0 python3.9[64488]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:36:31 compute-0 sudo[64485]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:32 compute-0 sudo[64638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsimyvvkhlfxdoekaykcwiextqzezgiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842992.1960106-321-150849020328813/AnsiballZ_command.py'
Feb 23 10:36:32 compute-0 sudo[64638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:32 compute-0 python3.9[64641]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:36:32 compute-0 sudo[64638]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:33 compute-0 sudo[64792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpdpcdzwkxwhvjwierfpodjnrvdnbgww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842993.1097467-341-273434841327242/AnsiballZ_service_facts.py'
Feb 23 10:36:33 compute-0 sudo[64792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:33 compute-0 python3.9[64795]: ansible-service_facts Invoked
Feb 23 10:36:33 compute-0 network[64812]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:36:33 compute-0 network[64813]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:36:33 compute-0 network[64814]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:36:35 compute-0 sudo[64792]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:36 compute-0 sshd-session[64950]: Connection closed by authenticating user root 143.198.30.3 port 35128 [preauth]
Feb 23 10:36:38 compute-0 sudo[65100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcygwkhrdkilgpkdyjketkvrvknvorkz ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771842998.2052777-371-205699070330730/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771842998.2052777-371-205699070330730/args'
Feb 23 10:36:38 compute-0 sudo[65100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:38 compute-0 sudo[65100]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:39 compute-0 sudo[65268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opmwouvzfimkeeinfyiyiodpxduilpgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771842999.0476427-393-121063603476316/AnsiballZ_dnf.py'
Feb 23 10:36:39 compute-0 sudo[65268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:39 compute-0 python3.9[65271]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:36:40 compute-0 sudo[65268]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:41 compute-0 sudo[65422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohynfxpxsmodtdcsxxoxummhhkkolwoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843001.1882377-419-236037940694419/AnsiballZ_package_facts.py'
Feb 23 10:36:41 compute-0 sudo[65422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:42 compute-0 python3.9[65425]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 23 10:36:42 compute-0 sudo[65422]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:43 compute-0 sudo[65575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmhzjaysuayuippzgnrfuqbdvaxprpjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843003.05287-439-230184008257642/AnsiballZ_stat.py'
Feb 23 10:36:43 compute-0 sudo[65575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:43 compute-0 python3.9[65578]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:36:43 compute-0 sudo[65575]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:44 compute-0 sudo[65701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfgofijpmnbdmymnegggrvxrckritaji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843003.05287-439-230184008257642/AnsiballZ_copy.py'
Feb 23 10:36:44 compute-0 sudo[65701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:44 compute-0 python3.9[65704]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843003.05287-439-230184008257642/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:36:44 compute-0 sudo[65701]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:45 compute-0 sudo[65856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhajokfdycsxqlisbuefbwjmjqzuxvdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843004.748246-469-205519291441366/AnsiballZ_stat.py'
Feb 23 10:36:45 compute-0 sudo[65856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:45 compute-0 python3.9[65859]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:36:45 compute-0 sudo[65856]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:45 compute-0 sudo[65982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efqmrcsxwuxgvsopsdparnygszeckjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843004.748246-469-205519291441366/AnsiballZ_copy.py'
Feb 23 10:36:45 compute-0 sudo[65982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:45 compute-0 python3.9[65985]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843004.748246-469-205519291441366/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:36:45 compute-0 sudo[65982]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:46 compute-0 sudo[66137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddoehqeprfauigbrnpojzfsyeihquvjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843006.548982-511-263266212426485/AnsiballZ_lineinfile.py'
Feb 23 10:36:46 compute-0 sudo[66137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:47 compute-0 python3.9[66140]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:36:47 compute-0 sudo[66137]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:48 compute-0 sudo[66292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcpzzlofiofvgazpxlrxxmjnvmgndtru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843008.2492719-541-274076207873729/AnsiballZ_setup.py'
Feb 23 10:36:48 compute-0 sudo[66292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:48 compute-0 python3.9[66295]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:36:49 compute-0 sudo[66292]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:49 compute-0 sudo[66377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-refolplfnvscxwvgvrerxlcbspqruzsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843008.2492719-541-274076207873729/AnsiballZ_systemd.py'
Feb 23 10:36:49 compute-0 sudo[66377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:49 compute-0 python3.9[66380]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:36:49 compute-0 sudo[66377]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:50 compute-0 sudo[66532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhsynxusjjagtriicsycqgpveaqzvmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843010.6608274-573-92194647413390/AnsiballZ_setup.py'
Feb 23 10:36:50 compute-0 sudo[66532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:51 compute-0 python3.9[66535]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:36:51 compute-0 sudo[66532]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:51 compute-0 sudo[66617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzbxyfjyumygelmlaearwtiihbaabxlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843010.6608274-573-92194647413390/AnsiballZ_systemd.py'
Feb 23 10:36:51 compute-0 sudo[66617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:52 compute-0 python3.9[66620]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:36:52 compute-0 chronyd[807]: chronyd exiting
Feb 23 10:36:52 compute-0 systemd[1]: Stopping NTP client/server...
Feb 23 10:36:52 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 23 10:36:52 compute-0 systemd[1]: Stopped NTP client/server.
Feb 23 10:36:52 compute-0 systemd[1]: Starting NTP client/server...
Feb 23 10:36:52 compute-0 chronyd[66628]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 10:36:52 compute-0 chronyd[66628]: Frequency -26.716 +/- 0.422 ppm read from /var/lib/chrony/drift
Feb 23 10:36:52 compute-0 chronyd[66628]: Loaded seccomp filter (level 2)
Feb 23 10:36:52 compute-0 systemd[1]: Started NTP client/server.
Feb 23 10:36:52 compute-0 sudo[66617]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:52 compute-0 sshd-session[61735]: Connection closed by 192.168.122.30 port 41562
Feb 23 10:36:52 compute-0 sshd-session[61732]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:36:52 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 23 10:36:52 compute-0 systemd[1]: session-14.scope: Consumed 21.378s CPU time.
Feb 23 10:36:52 compute-0 systemd-logind[808]: Session 14 logged out. Waiting for processes to exit.
Feb 23 10:36:52 compute-0 systemd-logind[808]: Removed session 14.
Feb 23 10:36:57 compute-0 sshd-session[66654]: Accepted publickey for zuul from 192.168.122.30 port 41164 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:36:57 compute-0 systemd-logind[808]: New session 15 of user zuul.
Feb 23 10:36:57 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 23 10:36:57 compute-0 sshd-session[66654]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:36:58 compute-0 python3.9[66807]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:36:59 compute-0 sudo[66961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkysoiygmjbrrnliizgrrvbieodpyzad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843019.2474597-41-73058789999612/AnsiballZ_file.py'
Feb 23 10:36:59 compute-0 sudo[66961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:36:59 compute-0 python3.9[66964]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:36:59 compute-0 sudo[66961]: pam_unix(sudo:session): session closed for user root
Feb 23 10:36:59 compute-0 sshd-session[66971]: Connection closed by 45.148.10.240 port 51608
Feb 23 10:37:00 compute-0 sudo[67138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhydzokaghkploetiormmczvadfurxjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843020.0626574-57-97696278123860/AnsiballZ_stat.py'
Feb 23 10:37:00 compute-0 sudo[67138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:00 compute-0 python3.9[67141]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:00 compute-0 sudo[67138]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:00 compute-0 sudo[67217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxytgtnacqxkjyguodszjllwdxvkocct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843020.0626574-57-97696278123860/AnsiballZ_file.py'
Feb 23 10:37:00 compute-0 sudo[67217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:01 compute-0 python3.9[67220]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.tp97f6lg recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:01 compute-0 sudo[67217]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:01 compute-0 sshd-session[67245]: Connection closed by authenticating user root 165.227.79.48 port 52106 [preauth]
Feb 23 10:37:02 compute-0 sudo[67372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cotpjxfgqmemhuwxkgakfbuvlgpwpvdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843022.1813204-97-117881679371353/AnsiballZ_stat.py'
Feb 23 10:37:02 compute-0 sudo[67372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:02 compute-0 python3.9[67375]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:02 compute-0 sudo[67372]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:03 compute-0 sudo[67496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkbeecfnigaaldeagmytbsbpvqdtcfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843022.1813204-97-117881679371353/AnsiballZ_copy.py'
Feb 23 10:37:03 compute-0 sudo[67496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:03 compute-0 python3.9[67499]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843022.1813204-97-117881679371353/.source _original_basename=.gl96l95n follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:03 compute-0 sudo[67496]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:03 compute-0 sudo[67649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djlykqhxcdojbptjswdwxzjuahvswnda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843023.47412-129-224598388461057/AnsiballZ_file.py'
Feb 23 10:37:03 compute-0 sudo[67649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:03 compute-0 python3.9[67652]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:37:03 compute-0 sudo[67649]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:04 compute-0 sudo[67802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwvfstyluebzbezubdgrfjbfvpqetall ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843024.1412194-145-202134042052128/AnsiballZ_stat.py'
Feb 23 10:37:04 compute-0 sudo[67802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:04 compute-0 python3.9[67805]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:04 compute-0 sudo[67802]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:04 compute-0 sudo[67926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvpdwdvfcrvffbkppdagglyzqtjucorb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843024.1412194-145-202134042052128/AnsiballZ_copy.py'
Feb 23 10:37:04 compute-0 sudo[67926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:05 compute-0 python3.9[67929]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843024.1412194-145-202134042052128/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:37:05 compute-0 sudo[67926]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:05 compute-0 sudo[68079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovgweqspubjkaboesezcrvarukajvqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843025.2039988-145-2671096329888/AnsiballZ_stat.py'
Feb 23 10:37:05 compute-0 sudo[68079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:05 compute-0 python3.9[68082]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:05 compute-0 sudo[68079]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:05 compute-0 sudo[68203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snlxuohxalyglmlikouilwjvrxeqvzgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843025.2039988-145-2671096329888/AnsiballZ_copy.py'
Feb 23 10:37:05 compute-0 sudo[68203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:06 compute-0 python3.9[68206]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843025.2039988-145-2671096329888/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:37:06 compute-0 sudo[68203]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:06 compute-0 sudo[68356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuikzczusrzihwwcafgrmgoilbhyzgls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843026.4358351-203-237563226325947/AnsiballZ_file.py'
Feb 23 10:37:06 compute-0 sudo[68356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:06 compute-0 python3.9[68359]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:06 compute-0 sudo[68356]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:07 compute-0 sudo[68509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpsklvkjukgfqtwswribsjptjrrpriim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843027.1353788-219-184372474186260/AnsiballZ_stat.py'
Feb 23 10:37:07 compute-0 sudo[68509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:07 compute-0 python3.9[68512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:07 compute-0 sudo[68509]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:07 compute-0 sudo[68636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mogazasiezketpwwbzstsjasgmgpbbej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843027.1353788-219-184372474186260/AnsiballZ_copy.py'
Feb 23 10:37:07 compute-0 sudo[68636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:08 compute-0 python3.9[68639]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843027.1353788-219-184372474186260/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:08 compute-0 sudo[68636]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:08 compute-0 sudo[68790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvyfuxtoceyzyfflqnlagfkrsytammwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843028.459776-249-156992761371016/AnsiballZ_stat.py'
Feb 23 10:37:08 compute-0 sudo[68790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:08 compute-0 python3.9[68793]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:08 compute-0 sudo[68790]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:09 compute-0 sudo[68916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwcrkfjcbsavtpzptqcxixysiawudfhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843028.459776-249-156992761371016/AnsiballZ_copy.py'
Feb 23 10:37:09 compute-0 sudo[68916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:09 compute-0 sshd-session[68864]: Connection closed by authenticating user root 143.198.30.3 port 40084 [preauth]
Feb 23 10:37:09 compute-0 python3.9[68919]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843028.459776-249-156992761371016/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:09 compute-0 sudo[68916]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:09 compute-0 sshd-session[68513]: Invalid user sol from 45.148.10.240 port 53840
Feb 23 10:37:10 compute-0 sshd-session[68514]: Invalid user sol from 45.148.10.240 port 53850
Feb 23 10:37:10 compute-0 sshd-session[68513]: Connection closed by invalid user sol 45.148.10.240 port 53840 [preauth]
Feb 23 10:37:10 compute-0 sshd-session[68514]: Connection closed by invalid user sol 45.148.10.240 port 53850 [preauth]
Feb 23 10:37:10 compute-0 sudo[69069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptwhzfntwidkkkukqntmhfkbvekgircs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843029.8027616-279-219708094031676/AnsiballZ_systemd.py'
Feb 23 10:37:10 compute-0 sudo[69069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:10 compute-0 python3.9[69072]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:37:10 compute-0 systemd[1]: Reloading.
Feb 23 10:37:10 compute-0 systemd-rc-local-generator[69101]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:37:10 compute-0 systemd-sysv-generator[69104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:37:10 compute-0 systemd[1]: Reloading.
Feb 23 10:37:10 compute-0 systemd-rc-local-generator[69140]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:37:10 compute-0 systemd-sysv-generator[69146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:37:11 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 23 10:37:11 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 23 10:37:11 compute-0 sudo[69069]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:11 compute-0 sudo[69312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bppmvhdtankvtcdzxnmlgiacvwzpdoql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843031.4446156-295-97415762755421/AnsiballZ_stat.py'
Feb 23 10:37:11 compute-0 sudo[69312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:11 compute-0 python3.9[69315]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:11 compute-0 sudo[69312]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:12 compute-0 sudo[69438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdgljdexhteojnjeetqiytclrmgpbqjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843031.4446156-295-97415762755421/AnsiballZ_copy.py'
Feb 23 10:37:12 compute-0 sudo[69438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:12 compute-0 python3.9[69441]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843031.4446156-295-97415762755421/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:12 compute-0 sudo[69438]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:12 compute-0 sshd-session[69363]: Invalid user solana from 45.148.10.240 port 53866
Feb 23 10:37:12 compute-0 sshd-session[69363]: Connection closed by invalid user solana 45.148.10.240 port 53866 [preauth]
Feb 23 10:37:13 compute-0 sudo[69593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bedorhbymcwnxqnafmetnmoclrfffncn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843032.7580113-325-27609497941121/AnsiballZ_stat.py'
Feb 23 10:37:13 compute-0 sudo[69593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:13 compute-0 python3.9[69596]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:13 compute-0 sudo[69593]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:13 compute-0 sudo[69717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vccjdmzqoavmoocqrjuzbxadnqcrygeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843032.7580113-325-27609497941121/AnsiballZ_copy.py'
Feb 23 10:37:13 compute-0 sudo[69717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:13 compute-0 python3.9[69720]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843032.7580113-325-27609497941121/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:13 compute-0 sudo[69717]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:13 compute-0 sshd-session[69466]: Invalid user solana from 45.148.10.240 port 53878
Feb 23 10:37:14 compute-0 sudo[69870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eocfkamhbuuzdmnjpbrermmuafopubxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843034.1982462-355-17627706854462/AnsiballZ_systemd.py'
Feb 23 10:37:14 compute-0 sudo[69870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:14 compute-0 python3.9[69873]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:37:14 compute-0 systemd[1]: Reloading.
Feb 23 10:37:14 compute-0 systemd-rc-local-generator[69893]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:37:14 compute-0 systemd-sysv-generator[69899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:37:14 compute-0 systemd[1]: Reloading.
Feb 23 10:37:14 compute-0 sshd-session[69466]: Connection closed by invalid user solana 45.148.10.240 port 53878 [preauth]
Feb 23 10:37:14 compute-0 systemd-rc-local-generator[69943]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:37:14 compute-0 systemd-sysv-generator[69948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:37:15 compute-0 systemd[1]: Starting Create netns directory...
Feb 23 10:37:15 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 10:37:15 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 10:37:15 compute-0 systemd[1]: Finished Create netns directory.
Feb 23 10:37:15 compute-0 sudo[69870]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:15 compute-0 sshd-session[69960]: Invalid user sol from 45.148.10.240 port 53888
Feb 23 10:37:15 compute-0 sshd-session[69960]: Connection closed by invalid user sol 45.148.10.240 port 53888 [preauth]
Feb 23 10:37:16 compute-0 python3.9[70115]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:37:16 compute-0 network[70132]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:37:16 compute-0 network[70133]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:37:16 compute-0 network[70134]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:37:17 compute-0 sshd-session[70140]: Invalid user sol from 45.148.10.240 port 35998
Feb 23 10:37:17 compute-0 sshd-session[70140]: Connection closed by invalid user sol 45.148.10.240 port 35998 [preauth]
Feb 23 10:37:18 compute-0 sshd-session[70272]: Invalid user ubuntu from 45.148.10.240 port 36010
Feb 23 10:37:18 compute-0 sudo[70399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxmdxxmshoggdrggvpklrbthblpthraw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843038.7249758-387-60212268612918/AnsiballZ_systemd.py'
Feb 23 10:37:18 compute-0 sudo[70399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:19 compute-0 sshd-session[70272]: Connection closed by invalid user ubuntu 45.148.10.240 port 36010 [preauth]
Feb 23 10:37:19 compute-0 python3.9[70402]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:37:19 compute-0 systemd[1]: Reloading.
Feb 23 10:37:19 compute-0 systemd-rc-local-generator[70426]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:37:19 compute-0 systemd-sysv-generator[70430]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:37:19 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 23 10:37:19 compute-0 iptables.init[70448]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 23 10:37:19 compute-0 iptables.init[70448]: iptables: Flushing firewall rules: [  OK  ]
Feb 23 10:37:19 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 23 10:37:19 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 23 10:37:19 compute-0 sudo[70399]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:20 compute-0 sudo[70644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqvjypnunsjyoyedqawxngjipjujpqbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843039.8365207-387-259423557723545/AnsiballZ_systemd.py'
Feb 23 10:37:20 compute-0 sudo[70644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:20 compute-0 sshd-session[70493]: Invalid user ubuntu from 45.148.10.240 port 36020
Feb 23 10:37:20 compute-0 python3.9[70647]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:37:20 compute-0 sudo[70644]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:20 compute-0 sshd-session[70493]: Connection closed by invalid user ubuntu 45.148.10.240 port 36020 [preauth]
Feb 23 10:37:21 compute-0 sudo[70799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xffvmupidwrioaundblonmcawsfrymfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843041.0568006-419-93915050583645/AnsiballZ_systemd.py'
Feb 23 10:37:21 compute-0 sudo[70799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:21 compute-0 python3.9[70802]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:37:21 compute-0 systemd[1]: Reloading.
Feb 23 10:37:21 compute-0 systemd-sysv-generator[70844]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:37:21 compute-0 systemd-rc-local-generator[70839]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:37:21 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 23 10:37:21 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 23 10:37:21 compute-0 sudo[70799]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:22 compute-0 sshd-session[70803]: Invalid user ubuntu from 45.148.10.240 port 36032
Feb 23 10:37:22 compute-0 sshd-session[70803]: Connection closed by invalid user ubuntu 45.148.10.240 port 36032 [preauth]
Feb 23 10:37:22 compute-0 sudo[71001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buznhwmreqdnncoqjmapvdtrcbctrnwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843042.112254-435-182628078558345/AnsiballZ_command.py'
Feb 23 10:37:22 compute-0 sudo[71001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:22 compute-0 python3.9[71004]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:37:22 compute-0 sudo[71001]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:23 compute-0 sudo[71157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlhkdtszeuucbrvuldvpgffeueufclvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843043.288576-463-48638080066539/AnsiballZ_stat.py'
Feb 23 10:37:23 compute-0 sudo[71157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:23 compute-0 python3.9[71160]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:23 compute-0 sudo[71157]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:23 compute-0 sshd-session[71030]: Invalid user ubuntu from 45.148.10.240 port 36040
Feb 23 10:37:24 compute-0 sudo[71283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-josqretcdltzernzjnloottvmgwuxhue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843043.288576-463-48638080066539/AnsiballZ_copy.py'
Feb 23 10:37:24 compute-0 sudo[71283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:24 compute-0 sshd-session[71030]: Connection closed by invalid user ubuntu 45.148.10.240 port 36040 [preauth]
Feb 23 10:37:24 compute-0 python3.9[71286]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843043.288576-463-48638080066539/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:24 compute-0 sudo[71283]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:24 compute-0 sudo[71439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clmkvbwpqcgzaumhmrwxxqunwzgamyql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843044.621626-493-184397577456083/AnsiballZ_systemd.py'
Feb 23 10:37:24 compute-0 sudo[71439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:25 compute-0 python3.9[71442]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:37:25 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 23 10:37:25 compute-0 sshd[1018]: Received SIGHUP; restarting.
Feb 23 10:37:25 compute-0 sshd[1018]: Server listening on 0.0.0.0 port 22.
Feb 23 10:37:25 compute-0 sshd[1018]: Server listening on :: port 22.
Feb 23 10:37:25 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 23 10:37:25 compute-0 sudo[71439]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:25 compute-0 sudo[71596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smntlwpwwgvwsggaqeplpjqkoefmbizo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843045.569849-509-159402512431174/AnsiballZ_file.py'
Feb 23 10:37:25 compute-0 sudo[71596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:25 compute-0 python3.9[71599]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:25 compute-0 sudo[71596]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:26 compute-0 sshd-session[71312]: Invalid user sol from 45.148.10.240 port 36048
Feb 23 10:37:26 compute-0 sudo[71751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smxvwzxxvdsiglgkewpwfvzfmmajgeqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843046.2508423-525-60252760376790/AnsiballZ_stat.py'
Feb 23 10:37:26 compute-0 sudo[71751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:26 compute-0 sshd-session[71312]: Connection closed by invalid user sol 45.148.10.240 port 36048 [preauth]
Feb 23 10:37:26 compute-0 python3.9[71754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:26 compute-0 sudo[71751]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:26 compute-0 sshd-session[71624]: Invalid user sol from 45.148.10.240 port 36062
Feb 23 10:37:26 compute-0 sudo[71875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnegrajesobelvgxxmtcaljsrqlmpljw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843046.2508423-525-60252760376790/AnsiballZ_copy.py'
Feb 23 10:37:26 compute-0 sudo[71875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:27 compute-0 python3.9[71878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843046.2508423-525-60252760376790/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:27 compute-0 sudo[71875]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:27 compute-0 sshd-session[71624]: Connection closed by invalid user sol 45.148.10.240 port 36062 [preauth]
Feb 23 10:37:27 compute-0 sshd-session[71903]: Invalid user solana from 45.148.10.240 port 56650
Feb 23 10:37:28 compute-0 sudo[72030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehpbabgzrpinkexcdomjfivxahdvwhig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843047.6876485-561-72602978710380/AnsiballZ_timezone.py'
Feb 23 10:37:28 compute-0 sudo[72030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:28 compute-0 sshd-session[71903]: Connection closed by invalid user solana 45.148.10.240 port 56650 [preauth]
Feb 23 10:37:28 compute-0 python3.9[72033]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 23 10:37:28 compute-0 systemd[1]: Starting Time & Date Service...
Feb 23 10:37:28 compute-0 systemd[1]: Started Time & Date Service.
Feb 23 10:37:28 compute-0 sudo[72030]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:29 compute-0 sudo[72187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mocieiohmziybddelehialoagmxabksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843048.8498256-579-124608661946581/AnsiballZ_file.py'
Feb 23 10:37:29 compute-0 sudo[72187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:29 compute-0 python3.9[72190]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:29 compute-0 sudo[72187]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:29 compute-0 sshd-session[72191]: Invalid user solana from 45.148.10.240 port 56666
Feb 23 10:37:29 compute-0 sshd-session[72191]: Connection closed by invalid user solana 45.148.10.240 port 56666 [preauth]
Feb 23 10:37:29 compute-0 sudo[72342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwklhaebvayhxjdkzxjulkfdchqmzbhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843049.5771863-595-10374035617639/AnsiballZ_stat.py'
Feb 23 10:37:29 compute-0 sudo[72342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:30 compute-0 python3.9[72345]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:30 compute-0 sudo[72342]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:30 compute-0 sudo[72466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igfbzowudsjowvbrhkufgtsxrmwixdjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843049.5771863-595-10374035617639/AnsiballZ_copy.py'
Feb 23 10:37:30 compute-0 sudo[72466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:30 compute-0 python3.9[72469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843049.5771863-595-10374035617639/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:30 compute-0 sudo[72466]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:31 compute-0 sudo[72621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gclrvlmezsfsfyhevurwuqeomijlbsfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843050.9442163-625-245344186375033/AnsiballZ_stat.py'
Feb 23 10:37:31 compute-0 sudo[72621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:31 compute-0 python3.9[72624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:31 compute-0 sudo[72621]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:31 compute-0 sshd-session[72494]: Invalid user solana from 45.148.10.240 port 56680
Feb 23 10:37:31 compute-0 sshd-session[72494]: Connection closed by invalid user solana 45.148.10.240 port 56680 [preauth]
Feb 23 10:37:31 compute-0 sudo[72745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brlsyiukwwbxkndaojvyqzfgzbjkpinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843050.9442163-625-245344186375033/AnsiballZ_copy.py'
Feb 23 10:37:31 compute-0 sudo[72745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:31 compute-0 python3.9[72748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843050.9442163-625-245344186375033/.source.yaml _original_basename=.wugnm83t follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:31 compute-0 sudo[72745]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:32 compute-0 sudo[72899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjxytedorohstmosjmaxcamusfkvoik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843052.2973597-655-278148780267478/AnsiballZ_stat.py'
Feb 23 10:37:32 compute-0 sudo[72899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:32 compute-0 python3.9[72902]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:32 compute-0 sudo[72899]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:33 compute-0 sudo[73024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftfssnsvdujbevpqtpqzxevbbnplvgkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843052.2973597-655-278148780267478/AnsiballZ_copy.py'
Feb 23 10:37:33 compute-0 sudo[73024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:33 compute-0 python3.9[73027]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843052.2973597-655-278148780267478/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:33 compute-0 sudo[73024]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:33 compute-0 sshd-session[72866]: Invalid user solana from 45.148.10.240 port 56688
Feb 23 10:37:33 compute-0 sshd-session[72866]: Connection closed by invalid user solana 45.148.10.240 port 56688 [preauth]
Feb 23 10:37:33 compute-0 sudo[73177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsavbailkxyurgqoewangjeampjevvua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843053.5563476-685-220357643322162/AnsiballZ_command.py'
Feb 23 10:37:33 compute-0 sudo[73177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:33 compute-0 python3.9[73180]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:37:33 compute-0 sudo[73177]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:34 compute-0 sudo[73333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uopmwusxuqubznhcfmutoyebljxnwqnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843054.2821999-701-103717163432831/AnsiballZ_command.py'
Feb 23 10:37:34 compute-0 sudo[73333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:34 compute-0 python3.9[73336]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:37:34 compute-0 sudo[73333]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:34 compute-0 sshd-session[73211]: Invalid user sol from 45.148.10.240 port 56702
Feb 23 10:37:35 compute-0 sudo[73487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qujkhtsgqrabernjfzxtqxrkjsmrakbh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843054.9096057-717-129956997517925/AnsiballZ_edpm_nftables_from_files.py'
Feb 23 10:37:35 compute-0 sudo[73487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:35 compute-0 sshd-session[73211]: Connection closed by invalid user sol 45.148.10.240 port 56702 [preauth]
Feb 23 10:37:35 compute-0 python3[73490]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 10:37:35 compute-0 sudo[73487]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:35 compute-0 sudo[73642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmokxjbqjnfuueywhyndxavraiacpzdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843055.7265787-733-105776381611408/AnsiballZ_stat.py'
Feb 23 10:37:35 compute-0 sudo[73642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:36 compute-0 python3.9[73645]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:36 compute-0 sudo[73642]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:36 compute-0 sshd-session[73515]: Invalid user sol from 45.148.10.240 port 56708
Feb 23 10:37:36 compute-0 sudo[73766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaytsotrreftlmfyflpbwnxgbexaikcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843055.7265787-733-105776381611408/AnsiballZ_copy.py'
Feb 23 10:37:36 compute-0 sudo[73766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:36 compute-0 sshd-session[73515]: Connection closed by invalid user sol 45.148.10.240 port 56708 [preauth]
Feb 23 10:37:36 compute-0 python3.9[73769]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843055.7265787-733-105776381611408/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:36 compute-0 sudo[73766]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:37 compute-0 sudo[73921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcgmeiqgfozfxicdoztbdxudhkmkxbdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843057.0312715-763-220619199234864/AnsiballZ_stat.py'
Feb 23 10:37:37 compute-0 sudo[73921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:37 compute-0 python3.9[73924]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:37 compute-0 sudo[73921]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:37 compute-0 sudo[74045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sasztkqjjuqzzfuciztiymcvreakpmau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843057.0312715-763-220619199234864/AnsiballZ_copy.py'
Feb 23 10:37:37 compute-0 sudo[74045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:38 compute-0 python3.9[74048]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843057.0312715-763-220619199234864/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:38 compute-0 sudo[74045]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:38 compute-0 sudo[74200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efvrgseszvfspuvmmqzujquisavntdde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843058.3766441-793-205450390536273/AnsiballZ_stat.py'
Feb 23 10:37:38 compute-0 sudo[74200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:38 compute-0 python3.9[74203]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:38 compute-0 sudo[74200]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:39 compute-0 sshd-session[73846]: Invalid user sol from 45.148.10.240 port 60268
Feb 23 10:37:39 compute-0 sshd-session[74148]: Invalid user sol from 45.148.10.240 port 60272
Feb 23 10:37:39 compute-0 sudo[74324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtuedfseqylorvgslmtdmuaavdniveno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843058.3766441-793-205450390536273/AnsiballZ_copy.py'
Feb 23 10:37:39 compute-0 sudo[74324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:39 compute-0 sshd-session[74148]: Connection closed by invalid user sol 45.148.10.240 port 60272 [preauth]
Feb 23 10:37:39 compute-0 python3.9[74327]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843058.3766441-793-205450390536273/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:39 compute-0 sudo[74324]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:39 compute-0 sudo[74477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otcakkdzakgahvqrliqkqxhfxmvqvpgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843059.6943016-823-80807353906016/AnsiballZ_stat.py'
Feb 23 10:37:39 compute-0 sudo[74477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:40 compute-0 python3.9[74480]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:40 compute-0 sudo[74477]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:40 compute-0 sshd-session[73846]: Connection closed by invalid user sol 45.148.10.240 port 60268 [preauth]
Feb 23 10:37:40 compute-0 sudo[74603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfsipczfgylduwvfjeusxlyoabtladcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843059.6943016-823-80807353906016/AnsiballZ_copy.py'
Feb 23 10:37:40 compute-0 sudo[74603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:40 compute-0 python3.9[74606]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843059.6943016-823-80807353906016/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:40 compute-0 sudo[74603]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:40 compute-0 sshd-session[74481]: Invalid user solana from 45.148.10.240 port 60278
Feb 23 10:37:41 compute-0 sshd-session[74481]: Connection closed by invalid user solana 45.148.10.240 port 60278 [preauth]
Feb 23 10:37:41 compute-0 sudo[74756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhigaabpgthslzxpgfozvpimhddufhpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843060.9446852-853-179149672748970/AnsiballZ_stat.py'
Feb 23 10:37:41 compute-0 sudo[74756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:41 compute-0 python3.9[74759]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:37:41 compute-0 sudo[74756]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:41 compute-0 sudo[74884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klmzifsqimbphxjanpehlfktduubpnot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843060.9446852-853-179149672748970/AnsiballZ_copy.py'
Feb 23 10:37:41 compute-0 sudo[74884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:41 compute-0 sshd-session[74856]: Connection closed by authenticating user root 143.198.30.3 port 54498 [preauth]
Feb 23 10:37:41 compute-0 python3.9[74887]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843060.9446852-853-179149672748970/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:42 compute-0 sudo[74884]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:42 compute-0 sudo[75037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzpxhipeacigewswrzsiamhrntazyjhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843062.379857-883-111706259555334/AnsiballZ_file.py'
Feb 23 10:37:42 compute-0 sudo[75037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:42 compute-0 python3.9[75040]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:42 compute-0 sudo[75037]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:42 compute-0 sshd-session[74783]: Invalid user solana from 45.148.10.240 port 60294
Feb 23 10:37:43 compute-0 sshd-session[74783]: Connection closed by invalid user solana 45.148.10.240 port 60294 [preauth]
Feb 23 10:37:43 compute-0 sudo[75190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yehcezanezzmyjqwvgvecurvkbssxorh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843063.1373167-899-205709740720398/AnsiballZ_command.py'
Feb 23 10:37:43 compute-0 sudo[75190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:43 compute-0 python3.9[75193]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:37:43 compute-0 sudo[75190]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:44 compute-0 sudo[75352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llzzanvnuodwtxtjpycwimacwwxxjqrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843063.928756-915-89752562215752/AnsiballZ_blockinfile.py'
Feb 23 10:37:44 compute-0 sudo[75352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:44 compute-0 python3.9[75355]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:44 compute-0 sudo[75352]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:44 compute-0 sshd-session[75225]: Invalid user solana from 45.148.10.240 port 60298
Feb 23 10:37:44 compute-0 sshd-session[75225]: Connection closed by invalid user solana 45.148.10.240 port 60298 [preauth]
Feb 23 10:37:45 compute-0 sshd-session[75381]: Invalid user solana from 45.148.10.240 port 60314
Feb 23 10:37:45 compute-0 sshd-session[75381]: Connection closed by invalid user solana 45.148.10.240 port 60314 [preauth]
Feb 23 10:37:45 compute-0 sudo[75508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rueifpygvvpsuwzuplgpbfllxfzxmltg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843064.9970796-933-92121878103510/AnsiballZ_file.py'
Feb 23 10:37:45 compute-0 sudo[75508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:45 compute-0 python3.9[75511]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:45 compute-0 sudo[75508]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:45 compute-0 sudo[75661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awmxjdlncbhyktyfvjojysvgldskimsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843065.6604748-933-119989906439308/AnsiballZ_file.py'
Feb 23 10:37:45 compute-0 sudo[75661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:46 compute-0 python3.9[75664]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:46 compute-0 sudo[75661]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:46 compute-0 sudo[75816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spwursxgnafkdkivaycpbfrvwkppufhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843066.4838648-963-88562803463515/AnsiballZ_mount.py'
Feb 23 10:37:46 compute-0 sudo[75816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:47 compute-0 python3.9[75819]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 23 10:37:47 compute-0 sudo[75816]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:47 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:37:47 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:37:47 compute-0 sshd-session[75741]: Invalid user solana from 45.148.10.240 port 44542
Feb 23 10:37:47 compute-0 sudo[75971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pusjzdslzsiugcxqcaeovtvfbpvsnksw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843067.3779137-963-210036689425769/AnsiballZ_mount.py'
Feb 23 10:37:47 compute-0 sudo[75971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:47 compute-0 python3.9[75974]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 23 10:37:47 compute-0 sudo[75971]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:48 compute-0 sshd-session[75741]: Connection closed by invalid user solana 45.148.10.240 port 44542 [preauth]
Feb 23 10:37:48 compute-0 sshd-session[75975]: Invalid user solana from 45.148.10.240 port 44558
Feb 23 10:37:48 compute-0 sshd-session[76002]: Connection closed by authenticating user root 165.227.79.48 port 46654 [preauth]
Feb 23 10:37:48 compute-0 sshd-session[66657]: Connection closed by 192.168.122.30 port 41164
Feb 23 10:37:48 compute-0 sshd-session[66654]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:37:48 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 23 10:37:48 compute-0 systemd[1]: session-15.scope: Consumed 27.498s CPU time.
Feb 23 10:37:48 compute-0 systemd-logind[808]: Session 15 logged out. Waiting for processes to exit.
Feb 23 10:37:48 compute-0 systemd-logind[808]: Removed session 15.
Feb 23 10:37:48 compute-0 sshd-session[75975]: Connection closed by invalid user solana 45.148.10.240 port 44558 [preauth]
Feb 23 10:37:51 compute-0 sshd-session[76006]: Invalid user sol from 45.148.10.240 port 44580
Feb 23 10:37:52 compute-0 sshd-session[76006]: Connection closed by invalid user sol 45.148.10.240 port 44580 [preauth]
Feb 23 10:37:52 compute-0 sshd-session[76004]: Invalid user sol from 45.148.10.240 port 44570
Feb 23 10:37:52 compute-0 sshd-session[76004]: Connection closed by invalid user sol 45.148.10.240 port 44570 [preauth]
Feb 23 10:37:53 compute-0 sshd-session[76008]: Invalid user sol from 45.148.10.240 port 44582
Feb 23 10:37:53 compute-0 sshd-session[76010]: Accepted publickey for zuul from 192.168.122.30 port 59520 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:37:53 compute-0 systemd-logind[808]: New session 16 of user zuul.
Feb 23 10:37:53 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 23 10:37:53 compute-0 sshd-session[76010]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:37:53 compute-0 sshd-session[76008]: Connection closed by invalid user sol 45.148.10.240 port 44582 [preauth]
Feb 23 10:37:54 compute-0 sudo[76163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atakpotlgmbbhhxvakbsuefnwkecrekz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843073.7828288-17-105389880640851/AnsiballZ_tempfile.py'
Feb 23 10:37:54 compute-0 sudo[76163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:54 compute-0 python3.9[76166]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 23 10:37:54 compute-0 sudo[76163]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:54 compute-0 sshd-session[76167]: Invalid user sol from 45.148.10.240 port 44584
Feb 23 10:37:55 compute-0 sudo[76318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuvuzjsyvpinkkakcyztntudfkopdpzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843074.659249-41-68566458347294/AnsiballZ_stat.py'
Feb 23 10:37:55 compute-0 sudo[76318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:55 compute-0 python3.9[76321]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:37:55 compute-0 sudo[76318]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:55 compute-0 sshd-session[76167]: Connection closed by invalid user sol 45.148.10.240 port 44584 [preauth]
Feb 23 10:37:56 compute-0 sudo[76471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwuufztrcinmqsdthmncjrxqhocrhptm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843075.5201309-61-123396279739370/AnsiballZ_setup.py'
Feb 23 10:37:56 compute-0 sudo[76471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:56 compute-0 python3.9[76474]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:37:56 compute-0 sudo[76471]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:56 compute-0 sudo[76626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltjbwllkquaekzxdtdtwcsecnlknecq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843076.5869348-78-83691368977951/AnsiballZ_blockinfile.py'
Feb 23 10:37:56 compute-0 sudo[76626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:57 compute-0 python3.9[76629]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC57tgLrKUPXv0UMpqsm4DMd95qZn/zmOD8aiB7hmd/95U72C4Nn2VN19J2Y2o7kwyF4BaBCW1ws+FTb40k47301EKDAOmxbVbKL6R6UBwnN4jMZxZBGfXYplF1XAQXQL31DNdLqUYFGr09yiEhK+FI9EeAz24haPxG4yaDsgkibqCUpq0W3hrcMMmhEl31hfelHnLrMwx3iN19WkzYFyWmljx7nATMJaPvw3saW6Tri8GhfvgrSk7f6kieExRazjvyFPqLq2PepaekW1YvfgBjzg2SPzjlhfBtS2k2ZgnYWyA6tSbloxzlOVVOsR2HTuMaq+j1v4dKOi0rTXfRhRjF3M1TVNeo8UDXGnwMKsBftLjpdaUQetmnjmsvN0RPwbboP0HunilAIfi/W47fUIiPstGrTez9/cntXfXhqi3/5J37p/2ltJtkNvleYBm3KZIGrgnZv5VdnnZFYp5GYhPrHNbnKw6DtSz7tXJ5VJ9sUQrM2dt+RRnHoGjyNSLKges=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGijbdbk9GU2SH0t8RAsu8Lnxq/FRHhG9Up8gkPbm9T1
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFksIDzTKfB1oNUULwcqQxVxQrdaf1JafANu4xYotyh0hIaME10cgj0oWfEJQUk8yqsk5pjKAhSdZlez0rVWP7E=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDO3oeQ2nRdGvX9eP2WjVXFd7QLf5SJvMwdaQb/iUjTjoJSm/B0MDCfC4KGeSvTs0IoyhU587wA5H3OfK+WhDF0303Kig5f52CBoiR46ZxzTg9o9LOWq+DJY74h1SIny2CIjjRCTzXP7A4dcMWgxWZ/Qq4IMZwPtmZhtMGKKfed6evgv+XcMp3SiygDVyVrmozjizQOMSYyCKXbIGEauWMr1QeIhdgXmmPmFQCRHj1aY6JJzgqjIqi+fPbuSRIPfH/4XB6YY6MMS7ttt0+0pZFnE4BQXkwbGH+3xGDWXPtLioRHLvaKW8H7qLgqawuCg9z4adHtAd/DIFt01u7pv1NNVCf2sNq9eG3xQWCrA1IZazf0Idj+8QoC0DrVTYGUZim5dwrLA+AuhF4AGWwMnOjtcrINsoFI/Qpmm4yfpNj2a1pR66ABrEjFd0w7GzKpj8LZfW5GzPUD0+7Qmoo4lEYa2kT3kkidjvANXa1QBKJF1Rs32JBAO1b+AOQ+xGIcUis=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILkv7zPKavEAKA0kpaQdMqL8aR1FjesOSVzQqa5pfmyP
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNfnVRNEtPv7ZEz5+V6cIJgP0Yun549U1rFx6P79OsM7WABbZOhyy222zcWtWuapzun/mlItSKOsQ9mXgGN3lD8=
                                             create=True mode=0644 path=/tmp/ansible.0z0fjhp8 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:57 compute-0 sudo[76626]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:57 compute-0 sshd-session[76475]: Invalid user sol from 45.148.10.240 port 44586
Feb 23 10:37:57 compute-0 sudo[76781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyawtpppvjraldfkylgfyegbygpccvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843077.3132906-94-98748690770044/AnsiballZ_command.py'
Feb 23 10:37:57 compute-0 sudo[76781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:57 compute-0 sshd-session[76475]: Connection closed by invalid user sol 45.148.10.240 port 44586 [preauth]
Feb 23 10:37:57 compute-0 python3.9[76784]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.0z0fjhp8' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:37:57 compute-0 sudo[76781]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:58 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 10:37:58 compute-0 sshd-session[76706]: Invalid user sol from 45.148.10.240 port 34492
Feb 23 10:37:58 compute-0 sudo[76938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atwcaidpnntwbfyjyscxnmhzrwwqojjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843078.0654075-110-213404447460334/AnsiballZ_file.py'
Feb 23 10:37:58 compute-0 sudo[76938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:37:58 compute-0 python3.9[76941]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.0z0fjhp8 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:37:58 compute-0 sudo[76938]: pam_unix(sudo:session): session closed for user root
Feb 23 10:37:58 compute-0 sshd-session[76706]: Connection closed by invalid user sol 45.148.10.240 port 34492 [preauth]
Feb 23 10:37:59 compute-0 sshd-session[76013]: Connection closed by 192.168.122.30 port 59520
Feb 23 10:37:59 compute-0 sshd-session[76010]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:37:59 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 23 10:37:59 compute-0 systemd[1]: session-16.scope: Consumed 2.572s CPU time.
Feb 23 10:37:59 compute-0 systemd-logind[808]: Session 16 logged out. Waiting for processes to exit.
Feb 23 10:37:59 compute-0 systemd-logind[808]: Removed session 16.
Feb 23 10:38:01 compute-0 sshd-session[76966]: Invalid user sol from 45.148.10.240 port 34494
Feb 23 10:38:01 compute-0 sshd-session[76967]: Invalid user sol from 45.148.10.240 port 34500
Feb 23 10:38:01 compute-0 sshd-session[76966]: Connection closed by invalid user sol 45.148.10.240 port 34494 [preauth]
Feb 23 10:38:01 compute-0 sshd-session[76967]: Connection closed by invalid user sol 45.148.10.240 port 34500 [preauth]
Feb 23 10:38:04 compute-0 sshd-session[76970]: Invalid user sol from 45.148.10.240 port 34504
Feb 23 10:38:04 compute-0 sshd-session[76974]: Accepted publickey for zuul from 192.168.122.30 port 42768 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:38:04 compute-0 systemd-logind[808]: New session 17 of user zuul.
Feb 23 10:38:04 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 23 10:38:04 compute-0 sshd-session[76974]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:38:04 compute-0 sshd-session[76970]: Connection closed by invalid user sol 45.148.10.240 port 34504 [preauth]
Feb 23 10:38:04 compute-0 sshd-session[76972]: Invalid user sol from 45.148.10.240 port 34510
Feb 23 10:38:04 compute-0 sshd-session[76972]: Connection closed by invalid user sol 45.148.10.240 port 34510 [preauth]
Feb 23 10:38:05 compute-0 python3.9[77127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:38:06 compute-0 sshd-session[77128]: Invalid user sol from 45.148.10.240 port 34526
Feb 23 10:38:06 compute-0 sshd-session[77128]: Connection closed by invalid user sol 45.148.10.240 port 34526 [preauth]
Feb 23 10:38:06 compute-0 sudo[77283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuapxfmpcbhnsyyocdmyypmhclbgygkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843085.92334-39-222484172268583/AnsiballZ_systemd.py'
Feb 23 10:38:06 compute-0 sudo[77283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:06 compute-0 python3.9[77286]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 10:38:06 compute-0 sudo[77283]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:07 compute-0 sudo[77440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwdkoxrpzlvekbwjzlqbglichzjxqpyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843086.9561126-55-50001554343322/AnsiballZ_systemd.py'
Feb 23 10:38:07 compute-0 sudo[77440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:07 compute-0 python3.9[77443]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:38:07 compute-0 sudo[77440]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:08 compute-0 sshd-session[77287]: Invalid user sol from 45.148.10.240 port 55164
Feb 23 10:38:08 compute-0 sudo[77594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sprdmbnjcujwtvernvbmmupjkdsonuey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843087.894101-73-24093577278366/AnsiballZ_command.py'
Feb 23 10:38:08 compute-0 sudo[77594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:08 compute-0 sshd-session[77287]: Connection closed by invalid user sol 45.148.10.240 port 55164 [preauth]
Feb 23 10:38:08 compute-0 python3.9[77597]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:38:08 compute-0 sudo[77594]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:09 compute-0 sudo[77750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liuflrhxqotdbmmwahzjfwugswikovzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843088.7324328-89-268446516693950/AnsiballZ_stat.py'
Feb 23 10:38:09 compute-0 sudo[77750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:09 compute-0 sshd-session[77598]: Invalid user sol from 45.148.10.240 port 55172
Feb 23 10:38:09 compute-0 python3.9[77753]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:38:09 compute-0 sudo[77750]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:09 compute-0 sshd-session[77598]: Connection closed by invalid user sol 45.148.10.240 port 55172 [preauth]
Feb 23 10:38:09 compute-0 sudo[77907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekdbksxhxblnmzkgrbdgfscgnahlxbdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843089.5742958-105-170207931266908/AnsiballZ_command.py'
Feb 23 10:38:09 compute-0 sudo[77907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:09 compute-0 python3.9[77910]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:38:10 compute-0 sudo[77907]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:10 compute-0 sudo[78063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpxrzdqgmjpetzxcwmhjnsdziywcgnkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843090.2603204-121-75585325581277/AnsiballZ_file.py'
Feb 23 10:38:10 compute-0 sudo[78063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:10 compute-0 python3.9[78066]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:10 compute-0 sudo[78063]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:10 compute-0 sshd-session[77832]: Invalid user sol from 45.148.10.240 port 55180
Feb 23 10:38:11 compute-0 sshd-session[77832]: Connection closed by invalid user sol 45.148.10.240 port 55180 [preauth]
Feb 23 10:38:11 compute-0 sshd-session[76977]: Connection closed by 192.168.122.30 port 42768
Feb 23 10:38:11 compute-0 sshd-session[76974]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:38:11 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 23 10:38:11 compute-0 systemd[1]: session-17.scope: Consumed 3.434s CPU time.
Feb 23 10:38:11 compute-0 systemd-logind[808]: Session 17 logged out. Waiting for processes to exit.
Feb 23 10:38:11 compute-0 systemd-logind[808]: Removed session 17.
Feb 23 10:38:12 compute-0 sshd-session[78092]: Invalid user sol from 45.148.10.240 port 55186
Feb 23 10:38:12 compute-0 sshd-session[78092]: Connection closed by invalid user sol 45.148.10.240 port 55186 [preauth]
Feb 23 10:38:13 compute-0 sshd-session[78094]: Invalid user sol from 45.148.10.240 port 55194
Feb 23 10:38:13 compute-0 sshd-session[78094]: Connection closed by invalid user sol 45.148.10.240 port 55194 [preauth]
Feb 23 10:38:14 compute-0 sshd-session[78096]: Connection closed by authenticating user root 143.198.30.3 port 51398 [preauth]
Feb 23 10:38:15 compute-0 sshd-session[78098]: Invalid user funded from 45.148.10.240 port 55200
Feb 23 10:38:15 compute-0 sshd-session[78098]: Connection closed by invalid user funded 45.148.10.240 port 55200 [preauth]
Feb 23 10:38:16 compute-0 sshd-session[78100]: Invalid user funded from 45.148.10.240 port 55216
Feb 23 10:38:16 compute-0 sshd-session[78100]: Connection closed by invalid user funded 45.148.10.240 port 55216 [preauth]
Feb 23 10:38:16 compute-0 sshd-session[78102]: Accepted publickey for zuul from 192.168.122.30 port 36402 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:38:16 compute-0 systemd-logind[808]: New session 18 of user zuul.
Feb 23 10:38:16 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 23 10:38:16 compute-0 sshd-session[78102]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:38:17 compute-0 python3.9[78255]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:38:18 compute-0 sshd-session[78256]: Invalid user sol from 45.148.10.240 port 45206
Feb 23 10:38:18 compute-0 sshd-session[78256]: Connection closed by invalid user sol 45.148.10.240 port 45206 [preauth]
Feb 23 10:38:18 compute-0 sudo[78411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqrdxfgdvxcqfaeeiwcrkacntgdpegkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843098.1377547-43-123498469809599/AnsiballZ_setup.py'
Feb 23 10:38:18 compute-0 sudo[78411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:18 compute-0 python3.9[78414]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:38:18 compute-0 sudo[78411]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:19 compute-0 sudo[78498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eclaiqszzloagejdgdztkiekykdzbiuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843098.1377547-43-123498469809599/AnsiballZ_dnf.py'
Feb 23 10:38:19 compute-0 sudo[78498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:19 compute-0 sshd-session[78415]: Invalid user sol from 45.148.10.240 port 45220
Feb 23 10:38:19 compute-0 python3.9[78501]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 10:38:19 compute-0 sshd-session[78415]: Connection closed by invalid user sol 45.148.10.240 port 45220 [preauth]
Feb 23 10:38:20 compute-0 sshd-session[78503]: Invalid user sol from 45.148.10.240 port 45234
Feb 23 10:38:21 compute-0 sudo[78498]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:21 compute-0 sshd-session[78503]: Connection closed by invalid user sol 45.148.10.240 port 45234 [preauth]
Feb 23 10:38:21 compute-0 python3.9[78655]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:38:22 compute-0 sshd-session[78628]: Invalid user sol from 45.148.10.240 port 45246
Feb 23 10:38:22 compute-0 sshd-session[78628]: Connection closed by invalid user sol 45.148.10.240 port 45246 [preauth]
Feb 23 10:38:23 compute-0 python3.9[78807]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 10:38:23 compute-0 python3.9[78957]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:38:24 compute-0 python3.9[79109]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:38:24 compute-0 sshd-session[78958]: Invalid user sol from 45.148.10.240 port 45258
Feb 23 10:38:24 compute-0 sshd-session[78958]: Connection closed by invalid user sol 45.148.10.240 port 45258 [preauth]
Feb 23 10:38:25 compute-0 sshd-session[78105]: Connection closed by 192.168.122.30 port 36402
Feb 23 10:38:25 compute-0 sshd-session[78102]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:38:25 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 23 10:38:25 compute-0 systemd[1]: session-18.scope: Consumed 5.293s CPU time.
Feb 23 10:38:25 compute-0 systemd-logind[808]: Session 18 logged out. Waiting for processes to exit.
Feb 23 10:38:25 compute-0 systemd-logind[808]: Removed session 18.
Feb 23 10:38:26 compute-0 sshd-session[79135]: Invalid user sol from 45.148.10.240 port 45260
Feb 23 10:38:26 compute-0 sshd-session[79135]: Connection closed by invalid user sol 45.148.10.240 port 45260 [preauth]
Feb 23 10:38:27 compute-0 sshd-session[79137]: Invalid user sol from 45.148.10.240 port 60086
Feb 23 10:38:27 compute-0 sshd-session[79137]: Connection closed by invalid user sol 45.148.10.240 port 60086 [preauth]
Feb 23 10:38:29 compute-0 sshd-session[79139]: Invalid user sol from 45.148.10.240 port 60100
Feb 23 10:38:29 compute-0 sshd-session[79139]: Connection closed by invalid user sol 45.148.10.240 port 60100 [preauth]
Feb 23 10:38:30 compute-0 sshd-session[79141]: Invalid user sol from 45.148.10.240 port 60116
Feb 23 10:38:30 compute-0 sshd-session[79141]: Connection closed by invalid user sol 45.148.10.240 port 60116 [preauth]
Feb 23 10:38:30 compute-0 sshd-session[79143]: Accepted publickey for zuul from 192.168.122.30 port 41670 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:38:30 compute-0 systemd-logind[808]: New session 19 of user zuul.
Feb 23 10:38:30 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 23 10:38:30 compute-0 sshd-session[79143]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:38:31 compute-0 python3.9[79298]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:38:32 compute-0 sshd-session[79246]: Invalid user sol from 45.148.10.240 port 60126
Feb 23 10:38:32 compute-0 sshd-session[79246]: Connection closed by invalid user sol 45.148.10.240 port 60126 [preauth]
Feb 23 10:38:33 compute-0 sudo[79454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glzuollkhsmhgcgjlwoofjkjhxardnrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843113.12942-75-197238454533740/AnsiballZ_file.py'
Feb 23 10:38:33 compute-0 sudo[79454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:33 compute-0 python3.9[79457]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:33 compute-0 sudo[79454]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:33 compute-0 sshd-session[79327]: Invalid user sol from 45.148.10.240 port 60138
Feb 23 10:38:33 compute-0 sshd-session[79327]: Connection closed by invalid user sol 45.148.10.240 port 60138 [preauth]
Feb 23 10:38:34 compute-0 sudo[79607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwnidlbtczjrxorjvetrsjpdwkcsphsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843113.8241785-75-154161422833687/AnsiballZ_file.py'
Feb 23 10:38:34 compute-0 sudo[79607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:34 compute-0 python3.9[79610]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:34 compute-0 sudo[79607]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:34 compute-0 sudo[79762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adeehlvonufwnczxwusmcvihkulqsjhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843114.4144373-104-125782476434829/AnsiballZ_stat.py'
Feb 23 10:38:34 compute-0 sudo[79762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:34 compute-0 python3.9[79765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:34 compute-0 sudo[79762]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:35 compute-0 sshd-session[79687]: Invalid user sol from 45.148.10.240 port 60146
Feb 23 10:38:35 compute-0 sshd-session[79687]: Connection closed by invalid user sol 45.148.10.240 port 60146 [preauth]
Feb 23 10:38:35 compute-0 sudo[79886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxpjnhdjlkuveieghzssqsnqupcwawir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843114.4144373-104-125782476434829/AnsiballZ_copy.py'
Feb 23 10:38:35 compute-0 sudo[79886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:35 compute-0 python3.9[79889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843114.4144373-104-125782476434829/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e3ae2d0ec36302e64cb44d19a0901e7d2f492740 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:35 compute-0 sudo[79886]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:35 compute-0 sshd-session[79914]: Connection closed by authenticating user root 165.227.79.48 port 52390 [preauth]
Feb 23 10:38:35 compute-0 sudo[80041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilyjymbdtlppmjmiuhypggpclidrsrdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843115.6829991-104-9828003960436/AnsiballZ_stat.py'
Feb 23 10:38:35 compute-0 sudo[80041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:36 compute-0 python3.9[80045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:36 compute-0 sudo[80041]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:36 compute-0 sudo[80167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njtsccudhkottxpielpzuzypilxcwidr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843115.6829991-104-9828003960436/AnsiballZ_copy.py'
Feb 23 10:38:36 compute-0 sudo[80167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:36 compute-0 python3.9[80170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843115.6829991-104-9828003960436/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=456fe982b95f9e62b29ffd85c2c457f6d674f8cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:36 compute-0 sudo[80167]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:36 compute-0 sshd-session[80043]: Invalid user sol from 45.148.10.240 port 60160
Feb 23 10:38:36 compute-0 sshd-session[80043]: Connection closed by invalid user sol 45.148.10.240 port 60160 [preauth]
Feb 23 10:38:36 compute-0 sudo[80320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcwyvbrdotelqnhpihpsbbkqpuqpzwsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843116.7498512-104-6212127303416/AnsiballZ_stat.py'
Feb 23 10:38:36 compute-0 sudo[80320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:37 compute-0 python3.9[80323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:37 compute-0 sudo[80320]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:37 compute-0 sudo[80444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upaacfqeuozuoyxvvzjwkvdqjxecqaps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843116.7498512-104-6212127303416/AnsiballZ_copy.py'
Feb 23 10:38:37 compute-0 sudo[80444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:37 compute-0 python3.9[80447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843116.7498512-104-6212127303416/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=66675a9a8a04ebddc2794bf133374f973aa6be21 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:37 compute-0 sudo[80444]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:38 compute-0 sudo[80599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiczqnawzzpsfspppidmjstvbvlwjkoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843117.8471634-184-91972542306287/AnsiballZ_file.py'
Feb 23 10:38:38 compute-0 sudo[80599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:38 compute-0 sshd-session[80448]: Invalid user sol from 45.148.10.240 port 41270
Feb 23 10:38:38 compute-0 sshd-session[80448]: Connection closed by invalid user sol 45.148.10.240 port 41270 [preauth]
Feb 23 10:38:38 compute-0 python3.9[80602]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:38 compute-0 sudo[80599]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:38 compute-0 sudo[80752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkbteouozpwhjsnxgntiujwcquouesir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843118.3756552-184-147093645688277/AnsiballZ_file.py'
Feb 23 10:38:38 compute-0 sudo[80752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:38 compute-0 python3.9[80755]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:38 compute-0 sudo[80752]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:39 compute-0 sudo[80907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svrsxpqnbrapwlmpvwemctoiyeqxsnhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843118.947827-214-30952648356502/AnsiballZ_stat.py'
Feb 23 10:38:39 compute-0 sudo[80907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:39 compute-0 python3.9[80910]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:39 compute-0 sudo[80907]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:39 compute-0 sshd-session[80855]: Invalid user sol from 45.148.10.240 port 41274
Feb 23 10:38:39 compute-0 sudo[81031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsswdymzgmhyvwgfbtpeilxhxkhizids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843118.947827-214-30952648356502/AnsiballZ_copy.py'
Feb 23 10:38:39 compute-0 sudo[81031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:39 compute-0 sshd-session[80855]: Connection closed by invalid user sol 45.148.10.240 port 41274 [preauth]
Feb 23 10:38:39 compute-0 python3.9[81034]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843118.947827-214-30952648356502/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e5102ae3e2129183958944961d227cb0121cd1f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:39 compute-0 sudo[81031]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:40 compute-0 sudo[81184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bolwuidkirdhsphttclkwacslksvegob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843119.8979533-214-68146107795680/AnsiballZ_stat.py'
Feb 23 10:38:40 compute-0 sudo[81184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:40 compute-0 python3.9[81187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:40 compute-0 sudo[81184]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:40 compute-0 sudo[81308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpkkswdfrvdlaimmwkjcdmdtqhqklzsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843119.8979533-214-68146107795680/AnsiballZ_copy.py'
Feb 23 10:38:40 compute-0 sudo[81308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:40 compute-0 python3.9[81311]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843119.8979533-214-68146107795680/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=4f02a94b1ea370b83dd291a6ddad114890f097f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:40 compute-0 sudo[81308]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:41 compute-0 sudo[81463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owsluctbczkcrdvzhgihckydwqkrziab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843120.9739897-214-261450117417583/AnsiballZ_stat.py'
Feb 23 10:38:41 compute-0 sudo[81463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:41 compute-0 python3.9[81466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:41 compute-0 sudo[81463]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:41 compute-0 sudo[81587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgosxviavdkjrvtmorlfjtwmsocbalrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843120.9739897-214-261450117417583/AnsiballZ_copy.py'
Feb 23 10:38:41 compute-0 sudo[81587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:41 compute-0 python3.9[81590]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843120.9739897-214-261450117417583/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=81eba7db29717506341d183e5c76866eff700b8f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:41 compute-0 sudo[81587]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:42 compute-0 sudo[81742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjguewkqtcorxumpqluyyuagxhbcbyax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843122.087396-293-116292228553804/AnsiballZ_file.py'
Feb 23 10:38:42 compute-0 sudo[81742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:42 compute-0 sshd-session[81411]: Invalid user sol from 45.148.10.240 port 41282
Feb 23 10:38:42 compute-0 sshd-session[81411]: Connection closed by invalid user sol 45.148.10.240 port 41282 [preauth]
Feb 23 10:38:42 compute-0 python3.9[81745]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:42 compute-0 sudo[81742]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:42 compute-0 sudo[81895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxyxaowwadbsbvfdrptjdgeljzggwreg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843122.6009228-293-84295152622077/AnsiballZ_file.py'
Feb 23 10:38:42 compute-0 sudo[81895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:42 compute-0 sshd-session[81615]: Invalid user sol from 45.148.10.240 port 41284
Feb 23 10:38:43 compute-0 python3.9[81898]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:43 compute-0 sudo[81895]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:43 compute-0 sshd-session[81615]: Connection closed by invalid user sol 45.148.10.240 port 41284 [preauth]
Feb 23 10:38:43 compute-0 sudo[82048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdyimtrzdblyovkfcebxmslpnqhcqwol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843123.237449-319-62707166999340/AnsiballZ_stat.py'
Feb 23 10:38:43 compute-0 sudo[82048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:43 compute-0 python3.9[82051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:43 compute-0 sudo[82048]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:44 compute-0 sudo[82173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbfvzbnjpjcohnlukseghkxpzejlxfqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843123.237449-319-62707166999340/AnsiballZ_copy.py'
Feb 23 10:38:44 compute-0 sudo[82173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:44 compute-0 python3.9[82176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843123.237449-319-62707166999340/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=033a77852ddf7e64ae1ffc4fc6ec9f3b329b751f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:44 compute-0 sudo[82173]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:44 compute-0 sudo[82327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llrsojsqulhzugmdeppvajuxsxynirie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843124.4747481-319-241591414979680/AnsiballZ_stat.py'
Feb 23 10:38:44 compute-0 sudo[82327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:44 compute-0 sshd-session[82146]: Invalid user sol from 45.148.10.240 port 41296
Feb 23 10:38:44 compute-0 python3.9[82330]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:44 compute-0 sudo[82327]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:45 compute-0 sshd-session[82146]: Connection closed by invalid user sol 45.148.10.240 port 41296 [preauth]
Feb 23 10:38:45 compute-0 sudo[82453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubywtvuuiidwcplrzjofardrxftdoptk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843124.4747481-319-241591414979680/AnsiballZ_copy.py'
Feb 23 10:38:45 compute-0 sudo[82453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:45 compute-0 python3.9[82456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843124.4747481-319-241591414979680/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c2159d66b73798f4e9f98a51bc989f49958d95a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:45 compute-0 sudo[82453]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:45 compute-0 sshd-session[82331]: Invalid user sol from 45.148.10.240 port 41312
Feb 23 10:38:45 compute-0 sudo[82606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acuoohjwshazhckvluaerbxawjkbitgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843125.6751373-319-172681577889367/AnsiballZ_stat.py'
Feb 23 10:38:45 compute-0 sudo[82606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:45 compute-0 sshd-session[82331]: Connection closed by invalid user sol 45.148.10.240 port 41312 [preauth]
Feb 23 10:38:46 compute-0 python3.9[82609]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:46 compute-0 sudo[82606]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:46 compute-0 sudo[82730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-govknfnhmspesicamoscisrjdvjoirpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843125.6751373-319-172681577889367/AnsiballZ_copy.py'
Feb 23 10:38:46 compute-0 sudo[82730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:46 compute-0 python3.9[82733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843125.6751373-319-172681577889367/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=17f3611b66a5ba917b402994c94b55a5cae6b160 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:46 compute-0 sudo[82730]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:47 compute-0 sudo[82883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-berlygpotlwxswfbgcjbpojkqoxclhvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843126.8285227-406-149204584921071/AnsiballZ_file.py'
Feb 23 10:38:47 compute-0 sudo[82883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:47 compute-0 python3.9[82886]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:47 compute-0 sudo[82883]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:47 compute-0 sudo[83037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spfcaxglaqlvtroukrukkmtqmfphprvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843127.3618665-406-175708663504424/AnsiballZ_file.py'
Feb 23 10:38:47 compute-0 sudo[83037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:47 compute-0 python3.9[83040]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:47 compute-0 sudo[83037]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:48 compute-0 sshd-session[83118]: Connection closed by authenticating user root 143.198.30.3 port 37472 [preauth]
Feb 23 10:38:48 compute-0 sudo[83195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xngxchtxtaqdqlcjgsjisetexuthyvkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843127.9608889-435-106164499738241/AnsiballZ_stat.py'
Feb 23 10:38:48 compute-0 sudo[83195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:48 compute-0 python3.9[83198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:48 compute-0 sudo[83195]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:48 compute-0 sudo[83319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzshejybfyndsijvkohcyljkmipbpihu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843127.9608889-435-106164499738241/AnsiballZ_copy.py'
Feb 23 10:38:48 compute-0 sudo[83319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:48 compute-0 sshd-session[82986]: Invalid user sol from 45.148.10.240 port 33914
Feb 23 10:38:48 compute-0 python3.9[83322]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843127.9608889-435-106164499738241/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=95bdc976e9bd2f5682c77326f0c632bbac8ae183 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:48 compute-0 sudo[83319]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:48 compute-0 sshd-session[82986]: Connection closed by invalid user sol 45.148.10.240 port 33914 [preauth]
Feb 23 10:38:48 compute-0 sshd-session[83117]: Invalid user sol from 45.148.10.240 port 33924
Feb 23 10:38:49 compute-0 sshd-session[83117]: Connection closed by invalid user sol 45.148.10.240 port 33924 [preauth]
Feb 23 10:38:49 compute-0 sudo[83472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxkflvyucfwvtfcetevbnrjdhqyfomjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843128.9873204-435-102574512820726/AnsiballZ_stat.py'
Feb 23 10:38:49 compute-0 sudo[83472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:49 compute-0 python3.9[83475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:49 compute-0 sudo[83472]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:49 compute-0 sudo[83596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnqnvmvdxkphnmmnqmsnwuvsuqiqdasr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843128.9873204-435-102574512820726/AnsiballZ_copy.py'
Feb 23 10:38:49 compute-0 sudo[83596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:49 compute-0 python3.9[83599]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843128.9873204-435-102574512820726/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c2159d66b73798f4e9f98a51bc989f49958d95a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:49 compute-0 sudo[83596]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:50 compute-0 sudo[83749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzijkzmxzewspqvptiloxqslmweoqlvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843129.9740644-435-86071898745997/AnsiballZ_stat.py'
Feb 23 10:38:50 compute-0 sudo[83749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:50 compute-0 python3.9[83752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:50 compute-0 sudo[83749]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:50 compute-0 sudo[83875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvhjvnhzpdaiqjurcvqrdevfmmcusxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843129.9740644-435-86071898745997/AnsiballZ_copy.py'
Feb 23 10:38:50 compute-0 sudo[83875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:50 compute-0 python3.9[83878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843129.9740644-435-86071898745997/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3090bb61b5a300c1e0afc8def2375554f54d7cf6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:50 compute-0 sudo[83875]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:51 compute-0 sshd-session[83823]: Invalid user sol from 45.148.10.240 port 33934
Feb 23 10:38:51 compute-0 sudo[84030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bosnhnntjuxcltonfhfqithbmzheymwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843131.6743815-538-10342777491521/AnsiballZ_file.py'
Feb 23 10:38:51 compute-0 sudo[84030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:51 compute-0 sshd-session[83903]: Invalid user sol from 45.148.10.240 port 33948
Feb 23 10:38:51 compute-0 sshd-session[83823]: Connection closed by invalid user sol 45.148.10.240 port 33934 [preauth]
Feb 23 10:38:52 compute-0 sshd-session[83903]: Connection closed by invalid user sol 45.148.10.240 port 33948 [preauth]
Feb 23 10:38:52 compute-0 python3.9[84033]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:52 compute-0 sudo[84030]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:52 compute-0 sudo[84183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfycebipzojmhukwghaftimursaitrvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843132.4603517-553-142182311618834/AnsiballZ_stat.py'
Feb 23 10:38:52 compute-0 sudo[84183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:52 compute-0 python3.9[84186]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:52 compute-0 sudo[84183]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:53 compute-0 sudo[84307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keqmfwublvstupovvoiwckoloeydswsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843132.4603517-553-142182311618834/AnsiballZ_copy.py'
Feb 23 10:38:53 compute-0 sudo[84307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:53 compute-0 python3.9[84310]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843132.4603517-553-142182311618834/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:53 compute-0 sudo[84307]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:53 compute-0 sudo[84461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uynioksgttowtomnrumztddobydlymoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843133.500362-578-158613067715054/AnsiballZ_file.py'
Feb 23 10:38:53 compute-0 sudo[84461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:53 compute-0 python3.9[84464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:53 compute-0 sudo[84461]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:54 compute-0 sudo[84617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnsfhxkjxjhwsxybwnumjkyjnbrpncuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843134.0795293-593-51841931393757/AnsiballZ_stat.py'
Feb 23 10:38:54 compute-0 sudo[84617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:54 compute-0 sshd-session[84445]: Invalid user sol from 45.148.10.240 port 33952
Feb 23 10:38:54 compute-0 python3.9[84620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:54 compute-0 sudo[84617]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:54 compute-0 sshd-session[84445]: Connection closed by invalid user sol 45.148.10.240 port 33952 [preauth]
Feb 23 10:38:54 compute-0 sudo[84741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vguajbwlszqatvnxfgyasrvjzxburbqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843134.0795293-593-51841931393757/AnsiballZ_copy.py'
Feb 23 10:38:54 compute-0 sudo[84741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:54 compute-0 sshd-session[84542]: Invalid user sol from 45.148.10.240 port 33958
Feb 23 10:38:55 compute-0 python3.9[84744]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843134.0795293-593-51841931393757/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:55 compute-0 sudo[84741]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:55 compute-0 sudo[84894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxnzccaezgzwcieqpfiegreiviivnpii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843135.2236865-633-105178138435063/AnsiballZ_file.py'
Feb 23 10:38:55 compute-0 sudo[84894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:55 compute-0 sshd-session[84542]: Connection closed by invalid user sol 45.148.10.240 port 33958 [preauth]
Feb 23 10:38:55 compute-0 python3.9[84897]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:55 compute-0 sudo[84894]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:56 compute-0 sudo[85047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzklzmlsjlkibabpxryzkjmrawuzkknu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843135.830816-647-201610582228424/AnsiballZ_stat.py'
Feb 23 10:38:56 compute-0 sudo[85047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:56 compute-0 python3.9[85050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:56 compute-0 sudo[85047]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:56 compute-0 sudo[85171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arqykqedijyckxsktboowzwzwqnmaenk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843135.830816-647-201610582228424/AnsiballZ_copy.py'
Feb 23 10:38:56 compute-0 sudo[85171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:56 compute-0 python3.9[85174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843135.830816-647-201610582228424/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:56 compute-0 sudo[85171]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:57 compute-0 sudo[85328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akbyprnhqcwawsmqchykehyvcwsspcea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843137.0650702-674-40351791501962/AnsiballZ_file.py'
Feb 23 10:38:57 compute-0 sudo[85328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:57 compute-0 python3.9[85331]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:57 compute-0 sudo[85328]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:57 compute-0 sshd-session[85175]: Invalid user sol from 45.148.10.240 port 59618
Feb 23 10:38:57 compute-0 sshd-session[85175]: Connection closed by invalid user sol 45.148.10.240 port 59618 [preauth]
Feb 23 10:38:58 compute-0 sshd-session[85276]: Invalid user sol from 45.148.10.240 port 59646
Feb 23 10:38:58 compute-0 sudo[85481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxcfeazsyycgwgtpbrkpmpxyhkewyzfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843137.9347448-690-93703137679719/AnsiballZ_stat.py'
Feb 23 10:38:58 compute-0 sudo[85481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:58 compute-0 sshd-session[85276]: Connection closed by invalid user sol 45.148.10.240 port 59646 [preauth]
Feb 23 10:38:58 compute-0 python3.9[85484]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:38:58 compute-0 sudo[85481]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:58 compute-0 sudo[85605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brxhqqtrsvhrocijjmahmyvpitabsond ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843137.9347448-690-93703137679719/AnsiballZ_copy.py'
Feb 23 10:38:58 compute-0 sudo[85605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:58 compute-0 python3.9[85608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843137.9347448-690-93703137679719/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:38:58 compute-0 sudo[85605]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:59 compute-0 sudo[85758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifkiawapprseombolqyksrhjjfrxuvop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843139.1798804-721-46715767255077/AnsiballZ_file.py'
Feb 23 10:38:59 compute-0 sudo[85758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:38:59 compute-0 python3.9[85761]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:38:59 compute-0 sudo[85758]: pam_unix(sudo:session): session closed for user root
Feb 23 10:38:59 compute-0 sudo[85911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzwxzgpkgexfzkearczvmeslkvmjjquo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843139.7371578-738-176786374233578/AnsiballZ_stat.py'
Feb 23 10:38:59 compute-0 sudo[85911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:00 compute-0 python3.9[85914]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:00 compute-0 sudo[85911]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:00 compute-0 sudo[86038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcccespybwaobkpdezptkxyyayivllqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843139.7371578-738-176786374233578/AnsiballZ_copy.py'
Feb 23 10:39:00 compute-0 sudo[86038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:00 compute-0 python3.9[86041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843139.7371578-738-176786374233578/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:00 compute-0 sudo[86038]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:00 compute-0 sshd-session[85962]: Invalid user sol from 45.148.10.240 port 59686
Feb 23 10:39:01 compute-0 sudo[86192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgbjtlnrzifxnragiumohgabodtbemjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843140.9129713-769-259607382166772/AnsiballZ_file.py'
Feb 23 10:39:01 compute-0 sudo[86192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:01 compute-0 chronyd[66628]: Selected source 216.232.132.102 (pool.ntp.org)
Feb 23 10:39:01 compute-0 python3.9[86195]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:01 compute-0 sudo[86192]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:01 compute-0 sshd-session[85962]: Connection closed by invalid user sol 45.148.10.240 port 59686 [preauth]
Feb 23 10:39:01 compute-0 sudo[86345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzubhonbyhdialyfbkzbhuqjheacvwgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843141.6886327-777-85384849712248/AnsiballZ_stat.py'
Feb 23 10:39:01 compute-0 sudo[86345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:02 compute-0 python3.9[86348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:02 compute-0 sudo[86345]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:02 compute-0 sudo[86469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebsfcaryynpoworjnufvyelvoausqmrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843141.6886327-777-85384849712248/AnsiballZ_copy.py'
Feb 23 10:39:02 compute-0 sudo[86469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:02 compute-0 sshd-session[85986]: Invalid user sol from 45.148.10.240 port 59702
Feb 23 10:39:02 compute-0 python3.9[86472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843141.6886327-777-85384849712248/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:02 compute-0 sudo[86469]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:02 compute-0 sshd-session[85986]: Connection closed by invalid user sol 45.148.10.240 port 59702 [preauth]
Feb 23 10:39:03 compute-0 sudo[86623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keprytexinhcpbiinqeymczvtxnyyyvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843143.0514493-809-230599332315663/AnsiballZ_file.py'
Feb 23 10:39:03 compute-0 sudo[86623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:03 compute-0 python3.9[86626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:03 compute-0 sudo[86623]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:03 compute-0 sudo[86779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdsrldqgbtbmdtgwkddsskxdurouqcck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843143.670352-823-65677302885832/AnsiballZ_stat.py'
Feb 23 10:39:03 compute-0 sudo[86779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:04 compute-0 sshd-session[86632]: Invalid user user from 45.148.10.240 port 59750
Feb 23 10:39:04 compute-0 python3.9[86782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:04 compute-0 sudo[86779]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:04 compute-0 sshd-session[86632]: Connection closed by invalid user user 45.148.10.240 port 59750 [preauth]
Feb 23 10:39:04 compute-0 sudo[86903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmpgphbnjdazeodvhmguwvgcgukiwrle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843143.670352-823-65677302885832/AnsiballZ_copy.py'
Feb 23 10:39:04 compute-0 sudo[86903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:04 compute-0 python3.9[86906]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843143.670352-823-65677302885832/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f8defe886283cfe041b7389d6c057fd531dc4fb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:04 compute-0 sudo[86903]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:04 compute-0 sshd-session[86578]: Invalid user sol from 45.148.10.240 port 59724
Feb 23 10:39:05 compute-0 sshd-session[86578]: Connection closed by invalid user sol 45.148.10.240 port 59724 [preauth]
Feb 23 10:39:07 compute-0 sshd-session[86931]: Invalid user user from 45.148.10.240 port 33558
Feb 23 10:39:07 compute-0 sshd-session[86931]: Connection closed by invalid user user 45.148.10.240 port 33558 [preauth]
Feb 23 10:39:08 compute-0 sshd-session[86933]: Invalid user solv from 45.148.10.240 port 33566
Feb 23 10:39:08 compute-0 sshd-session[86933]: Connection closed by invalid user solv 45.148.10.240 port 33566 [preauth]
Feb 23 10:39:10 compute-0 sshd-session[86935]: Invalid user solv from 45.148.10.240 port 33572
Feb 23 10:39:10 compute-0 sshd-session[79146]: Connection closed by 192.168.122.30 port 41670
Feb 23 10:39:10 compute-0 sshd-session[79143]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:39:10 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 23 10:39:10 compute-0 systemd[1]: session-19.scope: Consumed 22.396s CPU time.
Feb 23 10:39:10 compute-0 systemd-logind[808]: Session 19 logged out. Waiting for processes to exit.
Feb 23 10:39:10 compute-0 systemd-logind[808]: Removed session 19.
Feb 23 10:39:10 compute-0 sshd-session[86935]: Connection closed by invalid user solv 45.148.10.240 port 33572 [preauth]
Feb 23 10:39:11 compute-0 sshd-session[86937]: Invalid user solv from 45.148.10.240 port 33584
Feb 23 10:39:11 compute-0 sshd-session[86937]: Connection closed by invalid user solv 45.148.10.240 port 33584 [preauth]
Feb 23 10:39:13 compute-0 sshd-session[86941]: Invalid user solv from 45.148.10.240 port 33590
Feb 23 10:39:13 compute-0 sshd-session[86939]: Invalid user solv from 45.148.10.240 port 33588
Feb 23 10:39:13 compute-0 sshd-session[86939]: Connection closed by invalid user solv 45.148.10.240 port 33588 [preauth]
Feb 23 10:39:14 compute-0 sshd-session[86941]: Connection closed by invalid user solv 45.148.10.240 port 33590 [preauth]
Feb 23 10:39:15 compute-0 sshd-session[86943]: Accepted publickey for zuul from 192.168.122.30 port 39396 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:39:15 compute-0 systemd-logind[808]: New session 20 of user zuul.
Feb 23 10:39:15 compute-0 systemd[1]: Started Session 20 of User zuul.
Feb 23 10:39:15 compute-0 sshd-session[86943]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:39:16 compute-0 python3.9[87100]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:39:16 compute-0 sshd-session[86947]: Invalid user solv from 45.148.10.240 port 33594
Feb 23 10:39:16 compute-0 sshd-session[87001]: Invalid user solv from 45.148.10.240 port 33602
Feb 23 10:39:16 compute-0 sshd-session[86947]: Connection closed by invalid user solv 45.148.10.240 port 33594 [preauth]
Feb 23 10:39:16 compute-0 sshd-session[87001]: Connection closed by invalid user solv 45.148.10.240 port 33602 [preauth]
Feb 23 10:39:17 compute-0 sudo[87254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgkfgsfniooxapwhuvjlcfzmsobdthla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843157.0677073-43-155700269682056/AnsiballZ_file.py'
Feb 23 10:39:17 compute-0 sudo[87254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:17 compute-0 python3.9[87257]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:17 compute-0 sudo[87254]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:18 compute-0 sudo[87407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxlvppbeyzyrblxleosboxucrqyhhyqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843157.8172426-43-262594134442560/AnsiballZ_file.py'
Feb 23 10:39:18 compute-0 sudo[87407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:18 compute-0 python3.9[87410]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:18 compute-0 sudo[87407]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:19 compute-0 python3.9[87564]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:39:19 compute-0 sshd-session[87435]: Invalid user solv from 45.148.10.240 port 56000
Feb 23 10:39:19 compute-0 sshd-session[87435]: Connection closed by invalid user solv 45.148.10.240 port 56000 [preauth]
Feb 23 10:39:19 compute-0 sudo[87714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxaooxswxzxkryorapordxgjqoriovqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843159.521984-89-141735651156662/AnsiballZ_seboolean.py'
Feb 23 10:39:19 compute-0 sudo[87714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:19 compute-0 sshd-session[87511]: Invalid user solv from 45.148.10.240 port 56002
Feb 23 10:39:20 compute-0 python3.9[87717]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 10:39:20 compute-0 sshd-session[87511]: Connection closed by invalid user solv 45.148.10.240 port 56002 [preauth]
Feb 23 10:39:20 compute-0 sshd-session[87722]: Connection closed by authenticating user root 143.198.30.3 port 46524 [preauth]
Feb 23 10:39:20 compute-0 sudo[87714]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:21 compute-0 sudo[87876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfwifhrdkdsjwmqcgfzadthdqhwxlsad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843161.5899713-109-242128968365971/AnsiballZ_setup.py'
Feb 23 10:39:21 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 23 10:39:21 compute-0 sudo[87876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:22 compute-0 sshd-session[87794]: Invalid user solv from 45.148.10.240 port 56012
Feb 23 10:39:22 compute-0 python3.9[87879]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:39:22 compute-0 sshd-session[87849]: Invalid user validator from 45.148.10.240 port 56016
Feb 23 10:39:22 compute-0 sshd-session[87794]: Connection closed by invalid user solv 45.148.10.240 port 56012 [preauth]
Feb 23 10:39:22 compute-0 sudo[87876]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:22 compute-0 sshd-session[87849]: Connection closed by invalid user validator 45.148.10.240 port 56016 [preauth]
Feb 23 10:39:22 compute-0 sudo[87962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnxkuaoonddtfgdraolctydyjjnoith ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843161.5899713-109-242128968365971/AnsiballZ_dnf.py'
Feb 23 10:39:22 compute-0 sudo[87962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:22 compute-0 python3.9[87965]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:39:23 compute-0 sshd-session[87967]: Connection closed by authenticating user root 165.227.79.48 port 59488 [preauth]
Feb 23 10:39:24 compute-0 sudo[87962]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:24 compute-0 sudo[88120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coacdbczqnidqstificjifbsedaqdfqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843164.297282-133-39717444476395/AnsiballZ_systemd.py'
Feb 23 10:39:24 compute-0 sudo[88120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:25 compute-0 python3.9[88123]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:39:25 compute-0 sudo[88120]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:25 compute-0 sshd-session[88068]: Invalid user validator from 45.148.10.240 port 56026
Feb 23 10:39:25 compute-0 sshd-session[88124]: Invalid user solana from 45.148.10.240 port 56032
Feb 23 10:39:25 compute-0 sshd-session[88068]: Connection closed by invalid user validator 45.148.10.240 port 56026 [preauth]
Feb 23 10:39:25 compute-0 sshd-session[88124]: Connection closed by invalid user solana 45.148.10.240 port 56032 [preauth]
Feb 23 10:39:25 compute-0 sudo[88278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mybquppgmifonlvgfjdjtuyszenfoogl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843165.425049-149-187466274647049/AnsiballZ_edpm_nftables_snippet.py'
Feb 23 10:39:25 compute-0 sudo[88278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:25 compute-0 python3[88281]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 23 10:39:26 compute-0 sudo[88278]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:26 compute-0 sudo[88431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxleuhbslbmgmkqyfrmjpwrmgktbpndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843166.2344499-167-41784893976327/AnsiballZ_file.py'
Feb 23 10:39:26 compute-0 sudo[88431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:26 compute-0 python3.9[88434]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:26 compute-0 sudo[88431]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:27 compute-0 sudo[88584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbcttdgcfnbhevqmgzyzjdcaudftongd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843166.9292037-183-162561126290139/AnsiballZ_stat.py'
Feb 23 10:39:27 compute-0 sudo[88584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:27 compute-0 python3.9[88587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:27 compute-0 sudo[88584]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:27 compute-0 sudo[88665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkwvymdqoplcmjmtrqljuxlgipkkflmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843166.9292037-183-162561126290139/AnsiballZ_file.py'
Feb 23 10:39:27 compute-0 sudo[88665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:27 compute-0 python3.9[88668]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:28 compute-0 sudo[88665]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:28 compute-0 sshd-session[88589]: Invalid user solana from 45.148.10.240 port 60986
Feb 23 10:39:28 compute-0 sshd-session[88589]: Connection closed by invalid user solana 45.148.10.240 port 60986 [preauth]
Feb 23 10:39:28 compute-0 sudo[88820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytpktsmgswhxxoislktixhestinnmwdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843168.2027965-207-278323684560505/AnsiballZ_stat.py'
Feb 23 10:39:28 compute-0 sudo[88820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:28 compute-0 sshd-session[88669]: Invalid user solana from 45.148.10.240 port 60994
Feb 23 10:39:28 compute-0 python3.9[88823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:28 compute-0 sudo[88820]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:28 compute-0 sshd-session[88669]: Connection closed by invalid user solana 45.148.10.240 port 60994 [preauth]
Feb 23 10:39:28 compute-0 sudo[88899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnlwmgviixvlhtxyltfiztkirwlwpusu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843168.2027965-207-278323684560505/AnsiballZ_file.py'
Feb 23 10:39:28 compute-0 sudo[88899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:29 compute-0 python3.9[88902]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.3ucx6do6 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:29 compute-0 sudo[88899]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:29 compute-0 sudo[89052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbagrxgotojehafcytwfpukcvmrhwxbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843169.2946022-231-211091044238137/AnsiballZ_stat.py'
Feb 23 10:39:29 compute-0 sudo[89052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:29 compute-0 python3.9[89055]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:29 compute-0 sudo[89052]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:30 compute-0 sudo[89131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viiahwserlzotsfnjgclpmrqabctlzun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843169.2946022-231-211091044238137/AnsiballZ_file.py'
Feb 23 10:39:30 compute-0 sudo[89131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:30 compute-0 python3.9[89134]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:30 compute-0 sudo[89131]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:30 compute-0 sudo[89286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okaoikjisvbdmljzqnbyhnemrmziflck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843170.4468684-257-266574922217878/AnsiballZ_command.py'
Feb 23 10:39:30 compute-0 sudo[89286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:31 compute-0 python3.9[89289]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:31 compute-0 sudo[89286]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:31 compute-0 sshd-session[89211]: Invalid user solana from 45.148.10.240 port 32772
Feb 23 10:39:31 compute-0 sudo[89442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glackeadhcomzhjjqfcxtiuutdahlukq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843171.2571685-273-166055789919101/AnsiballZ_edpm_nftables_from_files.py'
Feb 23 10:39:31 compute-0 sudo[89442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:31 compute-0 sshd-session[89211]: Connection closed by invalid user solana 45.148.10.240 port 32772 [preauth]
Feb 23 10:39:31 compute-0 python3[89445]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 10:39:31 compute-0 sudo[89442]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:31 compute-0 sshd-session[89290]: Invalid user pbanx from 45.148.10.240 port 32776
Feb 23 10:39:32 compute-0 sshd-session[89290]: Connection closed by invalid user pbanx 45.148.10.240 port 32776 [preauth]
Feb 23 10:39:32 compute-0 sudo[89595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usoonwxhurdwwgdwqkhapblyzcoavwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843172.0873246-289-245457675270477/AnsiballZ_stat.py'
Feb 23 10:39:32 compute-0 sudo[89595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:32 compute-0 python3.9[89598]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:32 compute-0 sudo[89595]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:33 compute-0 sudo[89721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itxpoofyecyjbdicgrafkwzzqqbaizpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843172.0873246-289-245457675270477/AnsiballZ_copy.py'
Feb 23 10:39:33 compute-0 sudo[89721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:33 compute-0 python3.9[89724]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843172.0873246-289-245457675270477/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:33 compute-0 sudo[89721]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:33 compute-0 sudo[89874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqxzstglidudxayvjgjpeiwtxevbyoco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843173.5921707-319-79317961124731/AnsiballZ_stat.py'
Feb 23 10:39:33 compute-0 sudo[89874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:34 compute-0 python3.9[89877]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:34 compute-0 sudo[89874]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:34 compute-0 sudo[90004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptduukirpthycwbzdonojnetydglcggb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843173.5921707-319-79317961124731/AnsiballZ_copy.py'
Feb 23 10:39:34 compute-0 sudo[90004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:34 compute-0 sshd-session[89878]: Invalid user pbanx from 45.148.10.240 port 32790
Feb 23 10:39:34 compute-0 python3.9[90007]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843173.5921707-319-79317961124731/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:34 compute-0 sudo[90004]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:34 compute-0 sshd-session[89882]: Invalid user banxgg from 45.148.10.240 port 32796
Feb 23 10:39:34 compute-0 sshd-session[89882]: Connection closed by invalid user banxgg 45.148.10.240 port 32796 [preauth]
Feb 23 10:39:35 compute-0 sshd-session[89878]: Connection closed by invalid user pbanx 45.148.10.240 port 32790 [preauth]
Feb 23 10:39:35 compute-0 sudo[90157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnoibvrdjighhmzhdzgyplgfvptneinw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843174.940056-349-132751868909658/AnsiballZ_stat.py'
Feb 23 10:39:35 compute-0 sudo[90157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:35 compute-0 python3.9[90160]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:35 compute-0 sudo[90157]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:35 compute-0 sudo[90283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wobkbccjnpwmdxcmppmrutrtotfwwvjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843174.940056-349-132751868909658/AnsiballZ_copy.py'
Feb 23 10:39:35 compute-0 sudo[90283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:35 compute-0 python3.9[90286]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843174.940056-349-132751868909658/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:35 compute-0 sudo[90283]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:36 compute-0 sudo[90436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzkxpyzjwttezpdxkdkclkwnsncvidta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843176.3977342-379-46696116272418/AnsiballZ_stat.py'
Feb 23 10:39:36 compute-0 sudo[90436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:36 compute-0 python3.9[90439]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:36 compute-0 sudo[90436]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:37 compute-0 sudo[90565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyhuxacwtuzjdhoalgokidewcnieoyad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843176.3977342-379-46696116272418/AnsiballZ_copy.py'
Feb 23 10:39:37 compute-0 sudo[90565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:37 compute-0 python3.9[90569]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843176.3977342-379-46696116272418/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:37 compute-0 sudo[90565]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:37 compute-0 sshd-session[90513]: Invalid user banx from 45.148.10.240 port 55022
Feb 23 10:39:38 compute-0 sshd-session[90513]: Connection closed by invalid user banx 45.148.10.240 port 55022 [preauth]
Feb 23 10:39:38 compute-0 sudo[90719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-festwkmhtesqwhhaestftovvliwooqiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843177.8529968-409-117204099224409/AnsiballZ_stat.py'
Feb 23 10:39:38 compute-0 sudo[90719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:38 compute-0 python3.9[90722]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:38 compute-0 sudo[90719]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:38 compute-0 sshd-session[90489]: Invalid user banxgg from 45.148.10.240 port 55010
Feb 23 10:39:38 compute-0 sudo[90845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmvbrailrfpoufnzsdhkbiujcsglmpqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843177.8529968-409-117204099224409/AnsiballZ_copy.py'
Feb 23 10:39:38 compute-0 sudo[90845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:38 compute-0 sshd-session[90489]: Connection closed by invalid user banxgg 45.148.10.240 port 55010 [preauth]
Feb 23 10:39:38 compute-0 python3.9[90848]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843177.8529968-409-117204099224409/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:38 compute-0 sudo[90845]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:39 compute-0 sudo[90998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofxgjldmmrvprhcrwsrufbcwnedyagaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843179.3040133-439-274509872120425/AnsiballZ_file.py'
Feb 23 10:39:39 compute-0 sudo[90998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:39 compute-0 python3.9[91001]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:39 compute-0 sudo[90998]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:40 compute-0 sudo[91151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icjvltaevvvpivzvkqbbaojkpnkwtilr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843179.9920695-455-248804180014665/AnsiballZ_command.py'
Feb 23 10:39:40 compute-0 sudo[91151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:40 compute-0 python3.9[91154]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:40 compute-0 sudo[91151]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:40 compute-0 sshd-session[91156]: Invalid user banx from 45.148.10.240 port 55048
Feb 23 10:39:41 compute-0 sudo[91311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toyxngjixkcwxpubjgjohmqamllgumxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843180.7551196-471-134176758777908/AnsiballZ_blockinfile.py'
Feb 23 10:39:41 compute-0 sudo[91311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:41 compute-0 sshd-session[91155]: Connection closed by authenticating user root 45.148.10.240 port 55032 [preauth]
Feb 23 10:39:41 compute-0 python3.9[91314]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:41 compute-0 sudo[91311]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:41 compute-0 sshd-session[91156]: Connection closed by invalid user banx 45.148.10.240 port 55048 [preauth]
Feb 23 10:39:42 compute-0 sudo[91464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwgpmbnzkgsqolkvedzvqpwvezjeuhnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843181.7478986-489-114387834339094/AnsiballZ_command.py'
Feb 23 10:39:42 compute-0 sudo[91464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:42 compute-0 python3.9[91467]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:42 compute-0 sudo[91464]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:42 compute-0 sudo[91618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjwtchyrmfkjcdgfpwszgaodqmtykbhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843182.4994724-505-259001473005967/AnsiballZ_stat.py'
Feb 23 10:39:42 compute-0 sudo[91618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:42 compute-0 python3.9[91621]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:39:42 compute-0 sudo[91618]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:43 compute-0 sudo[91775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocytnjxexcnahdvqtakpjettmnlwocri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843183.0952573-521-199692898111098/AnsiballZ_command.py'
Feb 23 10:39:43 compute-0 sudo[91775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:43 compute-0 python3.9[91778]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:43 compute-0 sudo[91775]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:44 compute-0 sshd-session[91779]: Invalid user ethereum from 45.148.10.240 port 55070
Feb 23 10:39:44 compute-0 sudo[91933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayfxbnjhuefwrjpamwgokjgchleacgoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843183.8042002-537-15890976285107/AnsiballZ_file.py'
Feb 23 10:39:44 compute-0 sudo[91933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:44 compute-0 python3.9[91936]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:44 compute-0 sshd-session[91692]: Connection closed by authenticating user root 45.148.10.240 port 55060 [preauth]
Feb 23 10:39:44 compute-0 sudo[91933]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:44 compute-0 sshd-session[91779]: Connection closed by invalid user ethereum 45.148.10.240 port 55070 [preauth]
Feb 23 10:39:45 compute-0 python3.9[92086]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:39:46 compute-0 sudo[92237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpyyzfsftahuexcafgmkylukiwaebbsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843186.0975492-619-230249570730646/AnsiballZ_command.py'
Feb 23 10:39:46 compute-0 sudo[92237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:46 compute-0 python3.9[92240]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:2f:db:26:37" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:46 compute-0 ovs-vsctl[92241]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:2f:db:26:37 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 23 10:39:46 compute-0 sudo[92237]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:47 compute-0 sshd-session[92268]: Invalid user ethereum from 45.148.10.240 port 35944
Feb 23 10:39:47 compute-0 sudo[92395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajhjenmwijlffsxnnbaihxyxezgolsob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843186.8852508-637-92959858075760/AnsiballZ_command.py'
Feb 23 10:39:47 compute-0 sudo[92395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:47 compute-0 sshd-session[92268]: Connection closed by invalid user ethereum 45.148.10.240 port 35944 [preauth]
Feb 23 10:39:47 compute-0 python3.9[92398]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:47 compute-0 sudo[92395]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:47 compute-0 sshd-session[92242]: Invalid user eth from 45.148.10.240 port 35930
Feb 23 10:39:47 compute-0 sshd-session[92242]: Connection closed by invalid user eth 45.148.10.240 port 35930 [preauth]
Feb 23 10:39:47 compute-0 sudo[92551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpuzaurkvfzkfahvlubemaqlmqgeixdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843187.4920723-653-44649225979732/AnsiballZ_command.py'
Feb 23 10:39:47 compute-0 sudo[92551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:47 compute-0 python3.9[92554]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:39:47 compute-0 ovs-vsctl[92555]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 23 10:39:47 compute-0 sudo[92551]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:48 compute-0 python3.9[92705]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:39:49 compute-0 sudo[92857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lestgaaqafcxbqzbqwyeypxaflnrfhbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843188.9037724-687-34069188269920/AnsiballZ_file.py'
Feb 23 10:39:49 compute-0 sudo[92857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:49 compute-0 python3.9[92860]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:49 compute-0 sudo[92857]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:49 compute-0 sudo[93014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhrhvhhkvpebzvlkmtmtgpmhdyboikic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843189.7535946-703-25333344535902/AnsiballZ_stat.py'
Feb 23 10:39:49 compute-0 sudo[93014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:50 compute-0 python3.9[93017]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:50 compute-0 sudo[93014]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:50 compute-0 sudo[93093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqrhwuflnohoyxanmrrzjreqftabtjxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843189.7535946-703-25333344535902/AnsiballZ_file.py'
Feb 23 10:39:50 compute-0 sudo[93093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:50 compute-0 sshd-session[92861]: Invalid user eth from 45.148.10.240 port 35948
Feb 23 10:39:50 compute-0 python3.9[93096]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:50 compute-0 sudo[93093]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:50 compute-0 sshd-session[92861]: Connection closed by invalid user eth 45.148.10.240 port 35948 [preauth]
Feb 23 10:39:50 compute-0 sshd-session[92943]: Invalid user solv from 45.148.10.240 port 35956
Feb 23 10:39:50 compute-0 sshd-session[92943]: Connection closed by invalid user solv 45.148.10.240 port 35956 [preauth]
Feb 23 10:39:51 compute-0 sudo[93246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utzdcbwbigfxsefjwyjyzwrhwjcaidyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843190.7632785-703-4903811954246/AnsiballZ_stat.py'
Feb 23 10:39:51 compute-0 sudo[93246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:51 compute-0 python3.9[93249]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:51 compute-0 sudo[93246]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:51 compute-0 sudo[93325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvbqlibgiqkjxybijgzjzjjpndfryqdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843190.7632785-703-4903811954246/AnsiballZ_file.py'
Feb 23 10:39:51 compute-0 sudo[93325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:51 compute-0 python3.9[93328]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:39:51 compute-0 sudo[93325]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:52 compute-0 sudo[93478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubgvsuoogfkbuyhyuqdthxbekixowtsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843192.1168065-749-45514827802139/AnsiballZ_file.py'
Feb 23 10:39:52 compute-0 sudo[93478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:52 compute-0 python3.9[93481]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:52 compute-0 sudo[93478]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:53 compute-0 sudo[93635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwoapmzfpzlirullngkxvilpjleorjlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843192.8513262-765-42522946289899/AnsiballZ_stat.py'
Feb 23 10:39:53 compute-0 sudo[93635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:53 compute-0 sshd-session[93531]: Invalid user solv from 45.148.10.240 port 35972
Feb 23 10:39:53 compute-0 sshd-session[93531]: Connection closed by invalid user solv 45.148.10.240 port 35972 [preauth]
Feb 23 10:39:53 compute-0 python3.9[93638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:53 compute-0 sudo[93635]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:53 compute-0 sshd-session[93506]: Invalid user ubuntu from 45.148.10.240 port 35970
Feb 23 10:39:53 compute-0 sshd-session[93506]: Connection closed by invalid user ubuntu 45.148.10.240 port 35970 [preauth]
Feb 23 10:39:53 compute-0 sudo[93714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shawlkhuixlrqrwehoybswoeaydjbsmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843192.8513262-765-42522946289899/AnsiballZ_file.py'
Feb 23 10:39:53 compute-0 sudo[93714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:53 compute-0 python3.9[93717]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:53 compute-0 sudo[93714]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:54 compute-0 sudo[93867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrgvhfjphhnlzfupzlfzkmkmjgwalxcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843194.010025-789-274870999283100/AnsiballZ_stat.py'
Feb 23 10:39:54 compute-0 sudo[93867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:54 compute-0 python3.9[93870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:54 compute-0 sudo[93867]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:54 compute-0 sudo[93948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pecxgyxqplagqmhjfhegnaemzlwmsibq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843194.010025-789-274870999283100/AnsiballZ_file.py'
Feb 23 10:39:54 compute-0 sudo[93948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:54 compute-0 sshd-session[93896]: Connection closed by authenticating user root 143.198.30.3 port 55274 [preauth]
Feb 23 10:39:54 compute-0 python3.9[93951]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:54 compute-0 sudo[93948]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:55 compute-0 sudo[94101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjynzjuerdbklglfdhlkuslbgbqlcgin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843195.2066798-813-164332327426979/AnsiballZ_systemd.py'
Feb 23 10:39:55 compute-0 sudo[94101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:55 compute-0 python3.9[94104]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:39:55 compute-0 systemd[1]: Reloading.
Feb 23 10:39:55 compute-0 systemd-sysv-generator[94134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:39:55 compute-0 systemd-rc-local-generator[94125]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:39:56 compute-0 sudo[94101]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:56 compute-0 sshd-session[94105]: Invalid user ubuntu from 45.148.10.240 port 35974
Feb 23 10:39:56 compute-0 sudo[94303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqkfhqusufgyjgkclhixttfdnwqiblnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843196.2563539-829-234194550417084/AnsiballZ_stat.py'
Feb 23 10:39:56 compute-0 sudo[94303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:56 compute-0 sshd-session[94105]: Connection closed by invalid user ubuntu 45.148.10.240 port 35974 [preauth]
Feb 23 10:39:56 compute-0 python3.9[94306]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:56 compute-0 sudo[94303]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:56 compute-0 sudo[94382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqvnpkxvxogvmisypteauquxegrjroeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843196.2563539-829-234194550417084/AnsiballZ_file.py'
Feb 23 10:39:56 compute-0 sudo[94382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:56 compute-0 sshd-session[94151]: Invalid user ubuntu from 45.148.10.240 port 35990
Feb 23 10:39:57 compute-0 python3.9[94385]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:57 compute-0 sudo[94382]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:57 compute-0 sshd-session[94151]: Connection closed by invalid user ubuntu 45.148.10.240 port 35990 [preauth]
Feb 23 10:39:57 compute-0 sudo[94535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-depqeevgoaphhxbhcymwsmujghaveopb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843197.4417598-853-99462256367936/AnsiballZ_stat.py'
Feb 23 10:39:57 compute-0 sudo[94535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:57 compute-0 python3.9[94538]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:39:57 compute-0 sudo[94535]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:58 compute-0 sudo[94614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnarixmlwhqfeodpinwvxhboweebxkrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843197.4417598-853-99462256367936/AnsiballZ_file.py'
Feb 23 10:39:58 compute-0 sudo[94614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:58 compute-0 python3.9[94617]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:39:58 compute-0 sudo[94614]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:58 compute-0 sudo[94769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eucmeailqpczvougczdzyvyebpiefqmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843198.547365-877-65758368583783/AnsiballZ_systemd.py'
Feb 23 10:39:58 compute-0 sudo[94769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:39:59 compute-0 python3.9[94772]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:39:59 compute-0 systemd[1]: Reloading.
Feb 23 10:39:59 compute-0 systemd-sysv-generator[94805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:39:59 compute-0 systemd-rc-local-generator[94800]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:39:59 compute-0 sshd-session[94694]: Invalid user ubuntu from 45.148.10.240 port 59018
Feb 23 10:39:59 compute-0 systemd[1]: Starting Create netns directory...
Feb 23 10:39:59 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 10:39:59 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 10:39:59 compute-0 systemd[1]: Finished Create netns directory.
Feb 23 10:39:59 compute-0 sudo[94769]: pam_unix(sudo:session): session closed for user root
Feb 23 10:39:59 compute-0 sshd-session[94694]: Connection closed by invalid user ubuntu 45.148.10.240 port 59018 [preauth]
Feb 23 10:40:00 compute-0 sudo[94972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puwccsnsuodiibholqtynrdaarjcrzwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843199.8138378-897-196222025878388/AnsiballZ_file.py'
Feb 23 10:40:00 compute-0 sudo[94972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:00 compute-0 python3.9[94975]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:00 compute-0 sudo[94972]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:00 compute-0 sshd-session[94846]: Invalid user ubuntu from 45.148.10.240 port 59030
Feb 23 10:40:00 compute-0 sshd-session[94846]: Connection closed by invalid user ubuntu 45.148.10.240 port 59030 [preauth]
Feb 23 10:40:00 compute-0 sudo[95126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgatosicntkclpkentdhxlxbzdsidxgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843200.4415255-913-13535319583450/AnsiballZ_stat.py'
Feb 23 10:40:00 compute-0 sudo[95126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:00 compute-0 python3.9[95129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:00 compute-0 sudo[95126]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:01 compute-0 sudo[95250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsqnoyiafxsxcszvsyvjjhylivvmcnjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843200.4415255-913-13535319583450/AnsiballZ_copy.py'
Feb 23 10:40:01 compute-0 sudo[95250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:01 compute-0 python3.9[95253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843200.4415255-913-13535319583450/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:01 compute-0 sudo[95250]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:02 compute-0 sudo[95405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcpplolukqvqktuoduhkunmwmmlzvhnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843201.7749577-947-168508054698312/AnsiballZ_file.py'
Feb 23 10:40:02 compute-0 sudo[95405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:02 compute-0 sshd-session[95278]: Invalid user ubuntu from 45.148.10.240 port 59032
Feb 23 10:40:02 compute-0 python3.9[95408]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:02 compute-0 sudo[95405]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:02 compute-0 sshd-session[95278]: Connection closed by invalid user ubuntu 45.148.10.240 port 59032 [preauth]
Feb 23 10:40:02 compute-0 sudo[95558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzslwkuxynresqilkubsnwiankxdqgum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843202.4065843-963-202262895901031/AnsiballZ_file.py'
Feb 23 10:40:02 compute-0 sudo[95558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:02 compute-0 python3.9[95561]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:02 compute-0 sudo[95558]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:03 compute-0 sshd-session[95562]: Invalid user ubuntu from 45.148.10.240 port 59034
Feb 23 10:40:03 compute-0 sudo[95713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khisinzsgqnunbitnrfxotrftmdukasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843203.0846975-979-172266532448099/AnsiballZ_stat.py'
Feb 23 10:40:03 compute-0 sudo[95713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:03 compute-0 python3.9[95716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:03 compute-0 sudo[95713]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:03 compute-0 sudo[95837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffvpbmklbxoivsvdxhzkrvlzpvgewlfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843203.0846975-979-172266532448099/AnsiballZ_copy.py'
Feb 23 10:40:03 compute-0 sudo[95837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:04 compute-0 sshd-session[95562]: Connection closed by invalid user ubuntu 45.148.10.240 port 59034 [preauth]
Feb 23 10:40:04 compute-0 python3.9[95840]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843203.0846975-979-172266532448099/.source.json _original_basename=.5863cmw6 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:04 compute-0 sudo[95837]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:04 compute-0 python3.9[95990]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:05 compute-0 sshd-session[96015]: Invalid user ubuntu from 45.148.10.240 port 59044
Feb 23 10:40:06 compute-0 sshd-session[96015]: Connection closed by invalid user ubuntu 45.148.10.240 port 59044 [preauth]
Feb 23 10:40:06 compute-0 sudo[96415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qthfespzkmlnlfhjwlbacfbjvltazdkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843206.1631258-1059-223082660061608/AnsiballZ_container_config_data.py'
Feb 23 10:40:06 compute-0 sudo[96415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:06 compute-0 sshd-session[96288]: Invalid user ubuntu from 45.148.10.240 port 59058
Feb 23 10:40:06 compute-0 python3.9[96418]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 23 10:40:06 compute-0 sudo[96415]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:06 compute-0 sshd-session[96288]: Connection closed by invalid user ubuntu 45.148.10.240 port 59058 [preauth]
Feb 23 10:40:07 compute-0 sudo[96568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsakygycawvopyobabjzzmmcvpeixbpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843207.0432756-1081-52907403648439/AnsiballZ_container_config_hash.py'
Feb 23 10:40:07 compute-0 sudo[96568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:07 compute-0 python3.9[96571]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 10:40:07 compute-0 sudo[96568]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:08 compute-0 sudo[96723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mytmsvauwzaxdagvafiwfmnzxqlbufgf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843207.962175-1101-260959665545790/AnsiballZ_edpm_container_manage.py'
Feb 23 10:40:08 compute-0 sudo[96723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:08 compute-0 python3[96726]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 10:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:40:08 compute-0 sshd-session[96596]: Invalid user ubuntu from 45.148.10.240 port 39036
Feb 23 10:40:08 compute-0 podman[96762]: 2026-02-23 10:40:08.862791725 +0000 UTC m=+0.047400127 container create 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 10:40:08 compute-0 podman[96762]: 2026-02-23 10:40:08.836592555 +0000 UTC m=+0.021200977 image pull bfb93be9d83c3121be0312d4d8c02944841d931c726f68b412221913286262d4 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 10:40:08 compute-0 python3[96726]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 10:40:08 compute-0 sudo[96723]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:09 compute-0 sshd-session[96596]: Connection closed by invalid user ubuntu 45.148.10.240 port 39036 [preauth]
Feb 23 10:40:09 compute-0 sudo[96950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whxjxqydkaqrqxuofocbawihegitwlwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843209.194621-1117-103932435091675/AnsiballZ_stat.py'
Feb 23 10:40:09 compute-0 sudo[96950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:09 compute-0 python3.9[96953]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:40:09 compute-0 sudo[96950]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:09 compute-0 sshd-session[96846]: Invalid user ubuntu from 45.148.10.240 port 39052
Feb 23 10:40:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 10:40:09 compute-0 sshd-session[96846]: Connection closed by invalid user ubuntu 45.148.10.240 port 39052 [preauth]
Feb 23 10:40:10 compute-0 sudo[97105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaizbcroypgncsizwzupdjsqfwgspxpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843209.9982436-1135-105146191771879/AnsiballZ_file.py'
Feb 23 10:40:10 compute-0 sudo[97105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:10 compute-0 sshd-session[97109]: Connection closed by authenticating user root 165.227.79.48 port 56614 [preauth]
Feb 23 10:40:10 compute-0 python3.9[97108]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:10 compute-0 sudo[97105]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:10 compute-0 sudo[97184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xakcokoxbifjklvgyobgdildltqzjvda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843209.9982436-1135-105146191771879/AnsiballZ_stat.py'
Feb 23 10:40:10 compute-0 sudo[97184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:10 compute-0 python3.9[97187]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:40:10 compute-0 sudo[97184]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:11 compute-0 sudo[97338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqaukpeqwloaqlwmvnkackxdmlweenxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843210.909851-1135-247623157930381/AnsiballZ_copy.py'
Feb 23 10:40:11 compute-0 sudo[97338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:11 compute-0 python3.9[97341]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771843210.909851-1135-247623157930381/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:11 compute-0 sudo[97338]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:11 compute-0 sshd-session[97240]: Invalid user ubuntu from 45.148.10.240 port 39064
Feb 23 10:40:11 compute-0 sshd-session[97240]: Connection closed by invalid user ubuntu 45.148.10.240 port 39064 [preauth]
Feb 23 10:40:11 compute-0 sudo[97415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgdbvnoxzeochrvgibvevothtidodmej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843210.909851-1135-247623157930381/AnsiballZ_systemd.py'
Feb 23 10:40:11 compute-0 sudo[97415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:11 compute-0 python3.9[97418]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:40:11 compute-0 systemd[1]: Reloading.
Feb 23 10:40:12 compute-0 systemd-rc-local-generator[97437]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:40:12 compute-0 systemd-sysv-generator[97446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:40:12 compute-0 sudo[97415]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:12 compute-0 sudo[97537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqhhecdvmeqyvccxjazlpjcwfbhuqqyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843210.909851-1135-247623157930381/AnsiballZ_systemd.py'
Feb 23 10:40:12 compute-0 sudo[97537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:12 compute-0 python3.9[97540]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:40:12 compute-0 systemd[1]: Reloading.
Feb 23 10:40:12 compute-0 systemd-rc-local-generator[97563]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:40:12 compute-0 systemd-sysv-generator[97569]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:40:12 compute-0 systemd[1]: Starting ovn_controller container...
Feb 23 10:40:13 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 23 10:40:13 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:40:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ed40ebdf2a0c57ae70e6958489dd9e6ef2b8a379c7e0d26d4dce695da44a5aa/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 23 10:40:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1.
Feb 23 10:40:13 compute-0 podman[97588]: 2026-02-23 10:40:13.055899488 +0000 UTC m=+0.094707111 container init 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + sudo -E kolla_set_configs
Feb 23 10:40:13 compute-0 podman[97588]: 2026-02-23 10:40:13.088687454 +0000 UTC m=+0.127494997 container start 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:40:13 compute-0 edpm-start-podman-container[97588]: ovn_controller
Feb 23 10:40:13 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 23 10:40:13 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 23 10:40:13 compute-0 edpm-start-podman-container[97587]: Creating additional drop-in dependency for "ovn_controller" (13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1)
Feb 23 10:40:13 compute-0 podman[97610]: 2026-02-23 10:40:13.147595407 +0000 UTC m=+0.049804601 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller)
Feb 23 10:40:13 compute-0 systemd[1]: 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1-473bb4c2d8c36203.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 10:40:13 compute-0 systemd[1]: 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1-473bb4c2d8c36203.service: Failed with result 'exit-code'.
Feb 23 10:40:13 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 23 10:40:13 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 23 10:40:13 compute-0 systemd[1]: Reloading.
Feb 23 10:40:13 compute-0 systemd-rc-local-generator[97674]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:40:13 compute-0 systemd-sysv-generator[97679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:40:13 compute-0 systemd[1]: Started ovn_controller container.
Feb 23 10:40:13 compute-0 sshd-session[97510]: Invalid user ubuntu from 45.148.10.240 port 39072
Feb 23 10:40:13 compute-0 systemd[97650]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 23 10:40:13 compute-0 sudo[97537]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:13 compute-0 systemd[97650]: Queued start job for default target Main User Target.
Feb 23 10:40:13 compute-0 systemd[97650]: Created slice User Application Slice.
Feb 23 10:40:13 compute-0 systemd[97650]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 23 10:40:13 compute-0 systemd[97650]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 10:40:13 compute-0 systemd[97650]: Reached target Paths.
Feb 23 10:40:13 compute-0 systemd[97650]: Reached target Timers.
Feb 23 10:40:13 compute-0 systemd[97650]: Starting D-Bus User Message Bus Socket...
Feb 23 10:40:13 compute-0 systemd[97650]: Starting Create User's Volatile Files and Directories...
Feb 23 10:40:13 compute-0 sshd-session[97510]: Connection closed by invalid user ubuntu 45.148.10.240 port 39072 [preauth]
Feb 23 10:40:13 compute-0 systemd[97650]: Listening on D-Bus User Message Bus Socket.
Feb 23 10:40:13 compute-0 systemd[97650]: Reached target Sockets.
Feb 23 10:40:13 compute-0 systemd[97650]: Finished Create User's Volatile Files and Directories.
Feb 23 10:40:13 compute-0 systemd[97650]: Reached target Basic System.
Feb 23 10:40:13 compute-0 systemd[97650]: Reached target Main User Target.
Feb 23 10:40:13 compute-0 systemd[97650]: Startup finished in 97ms.
Feb 23 10:40:13 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 23 10:40:13 compute-0 systemd[1]: Started Session c1 of User root.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 10:40:13 compute-0 ovn_controller[97601]: INFO:__main__:Validating config file
Feb 23 10:40:13 compute-0 ovn_controller[97601]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 10:40:13 compute-0 ovn_controller[97601]: INFO:__main__:Writing out command to execute
Feb 23 10:40:13 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: ++ cat /run_command
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + ARGS=
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + sudo kolla_copy_cacerts
Feb 23 10:40:13 compute-0 systemd[1]: Started Session c2 of User root.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + [[ ! -n '' ]]
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + . kolla_extend_start
Feb 23 10:40:13 compute-0 ovn_controller[97601]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + umask 0022
Feb 23 10:40:13 compute-0 ovn_controller[97601]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 23 10:40:13 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6088] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6094] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <warn>  [1771843213.6097] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6102] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6108] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6110] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 23 10:40:13 compute-0 kernel: br-int: entered promiscuous mode
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00024|main|INFO|OVS feature set changed, force recompute.
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 10:40:13 compute-0 ovn_controller[97601]: 2026-02-23T10:40:13Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6395] manager: (ovn-c6a037-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 23 10:40:13 compute-0 systemd-udevd[97742]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:40:13 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6558] device (genev_sys_6081): carrier: link connected
Feb 23 10:40:13 compute-0 NetworkManager[57207]: <info>  [1771843213.6560] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 23 10:40:14 compute-0 NetworkManager[57207]: <info>  [1771843214.2538] manager: (ovn-48738a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 23 10:40:14 compute-0 sshd-session[97746]: Invalid user ubuntu from 45.148.10.240 port 39076
Feb 23 10:40:14 compute-0 sshd-session[97746]: Connection closed by invalid user ubuntu 45.148.10.240 port 39076 [preauth]
Feb 23 10:40:14 compute-0 python3.9[97873]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 10:40:15 compute-0 sudo[98025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fipmalmhiordmecglthfcclkkhzzdiaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843215.5949614-1225-106492473191190/AnsiballZ_stat.py'
Feb 23 10:40:15 compute-0 sudo[98025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:16 compute-0 python3.9[98028]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:16 compute-0 sudo[98025]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:16 compute-0 sshd-session[97898]: Invalid user ubuntu from 45.148.10.240 port 39082
Feb 23 10:40:16 compute-0 sudo[98149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oisuytxjulqxquzuvhkdroxxabxrnigw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843215.5949614-1225-106492473191190/AnsiballZ_copy.py'
Feb 23 10:40:16 compute-0 sudo[98149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:16 compute-0 sshd-session[97898]: Connection closed by invalid user ubuntu 45.148.10.240 port 39082 [preauth]
Feb 23 10:40:16 compute-0 python3.9[98152]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843215.5949614-1225-106492473191190/.source.yaml _original_basename=.4xlbhdr1 follow=False checksum=016f66a96e2ff08d5fd9c3b91f79c30be0d5562a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:16 compute-0 sudo[98149]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:17 compute-0 sudo[98304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmgyxgyifrfggsbudenocuwuqhvbeufx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843216.8527043-1255-144799408423182/AnsiballZ_command.py'
Feb 23 10:40:17 compute-0 sudo[98304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:17 compute-0 python3.9[98307]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:40:17 compute-0 ovs-vsctl[98308]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 23 10:40:17 compute-0 sudo[98304]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:17 compute-0 sshd-session[98229]: Invalid user ubuntu from 45.148.10.240 port 34926
Feb 23 10:40:17 compute-0 sshd-session[98229]: Connection closed by invalid user ubuntu 45.148.10.240 port 34926 [preauth]
Feb 23 10:40:17 compute-0 sudo[98458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqjuwchlqcxlsliscmdmhjsjrlbhlbkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843217.6194353-1271-17759821834897/AnsiballZ_command.py'
Feb 23 10:40:17 compute-0 sudo[98458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:18 compute-0 python3.9[98461]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:40:18 compute-0 ovs-vsctl[98463]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 23 10:40:18 compute-0 sudo[98458]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:18 compute-0 sudo[98616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouwrejmyixydrylzwgbbaxklyhutjnqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843218.732023-1299-164763317785239/AnsiballZ_command.py'
Feb 23 10:40:18 compute-0 sudo[98616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:19 compute-0 python3.9[98619]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:40:19 compute-0 ovs-vsctl[98620]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 23 10:40:19 compute-0 sudo[98616]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:19 compute-0 sshd-session[98489]: Invalid user ubuntu from 45.148.10.240 port 34940
Feb 23 10:40:19 compute-0 sshd-session[98489]: Connection closed by invalid user ubuntu 45.148.10.240 port 34940 [preauth]
Feb 23 10:40:19 compute-0 sshd-session[86946]: Connection closed by 192.168.122.30 port 39396
Feb 23 10:40:19 compute-0 sshd-session[86943]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:40:19 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Feb 23 10:40:19 compute-0 systemd[1]: session-20.scope: Consumed 37.255s CPU time.
Feb 23 10:40:19 compute-0 systemd-logind[808]: Session 20 logged out. Waiting for processes to exit.
Feb 23 10:40:19 compute-0 systemd-logind[808]: Removed session 20.
Feb 23 10:40:20 compute-0 sshd-session[98645]: Invalid user ubuntu from 45.148.10.240 port 34946
Feb 23 10:40:20 compute-0 sshd-session[98645]: Connection closed by invalid user ubuntu 45.148.10.240 port 34946 [preauth]
Feb 23 10:40:22 compute-0 sshd-session[98647]: Invalid user ubuntu from 45.148.10.240 port 34950
Feb 23 10:40:22 compute-0 sshd-session[98647]: Connection closed by invalid user ubuntu 45.148.10.240 port 34950 [preauth]
Feb 23 10:40:23 compute-0 sshd-session[98649]: Invalid user ubuntu from 45.148.10.240 port 34956
Feb 23 10:40:23 compute-0 sshd-session[98649]: Connection closed by invalid user ubuntu 45.148.10.240 port 34956 [preauth]
Feb 23 10:40:23 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 23 10:40:23 compute-0 systemd[97650]: Activating special unit Exit the Session...
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped target Main User Target.
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped target Basic System.
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped target Paths.
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped target Sockets.
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped target Timers.
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 10:40:23 compute-0 systemd[97650]: Closed D-Bus User Message Bus Socket.
Feb 23 10:40:23 compute-0 systemd[97650]: Stopped Create User's Volatile Files and Directories.
Feb 23 10:40:23 compute-0 systemd[97650]: Removed slice User Application Slice.
Feb 23 10:40:23 compute-0 systemd[97650]: Reached target Shutdown.
Feb 23 10:40:23 compute-0 systemd[97650]: Finished Exit the Session.
Feb 23 10:40:23 compute-0 systemd[97650]: Reached target Exit the Session.
Feb 23 10:40:23 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 23 10:40:23 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 23 10:40:23 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 10:40:23 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 10:40:23 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 10:40:23 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 10:40:23 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 23 10:40:25 compute-0 sshd-session[98654]: Invalid user ubuntu from 45.148.10.240 port 34962
Feb 23 10:40:25 compute-0 sshd-session[98654]: Connection closed by invalid user ubuntu 45.148.10.240 port 34962 [preauth]
Feb 23 10:40:25 compute-0 sshd-session[98656]: Accepted publickey for zuul from 192.168.122.30 port 51302 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:40:25 compute-0 systemd-logind[808]: New session 22 of user zuul.
Feb 23 10:40:25 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 23 10:40:25 compute-0 sshd-session[98656]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:40:26 compute-0 python3.9[98810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:40:27 compute-0 sudo[98967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkptstpucuewpdaqafoheanyevamlhys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843227.1370494-43-35453528437888/AnsiballZ_file.py'
Feb 23 10:40:27 compute-0 sudo[98967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:27 compute-0 sshd-session[98712]: Invalid user ubuntu from 45.148.10.240 port 34976
Feb 23 10:40:27 compute-0 python3.9[98970]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:27 compute-0 sudo[98967]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:27 compute-0 sshd-session[98712]: Connection closed by invalid user ubuntu 45.148.10.240 port 34976 [preauth]
Feb 23 10:40:28 compute-0 sudo[99120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecqbvlxjcivzoshsffdhvmskbniijaah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843227.9510849-43-109702902972326/AnsiballZ_file.py'
Feb 23 10:40:28 compute-0 sudo[99120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:28 compute-0 sshd-session[98892]: Invalid user ubuntu from 45.148.10.240 port 56358
Feb 23 10:40:28 compute-0 python3.9[99123]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:28 compute-0 sudo[99120]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:28 compute-0 sshd-session[98892]: Connection closed by invalid user ubuntu 45.148.10.240 port 56358 [preauth]
Feb 23 10:40:28 compute-0 sudo[99273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccxxaayqcszlnlcxeakejloqyeeclkbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843228.5804458-43-165277694310536/AnsiballZ_file.py'
Feb 23 10:40:28 compute-0 sudo[99273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:29 compute-0 python3.9[99276]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:29 compute-0 sudo[99273]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:29 compute-0 sudo[99428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nelntzpknhujagdxvseqshbibaqwzpcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843229.1783404-43-102225258537572/AnsiballZ_file.py'
Feb 23 10:40:29 compute-0 sudo[99428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:29 compute-0 sshd-session[99277]: Invalid user sdadmin from 45.148.10.240 port 56366
Feb 23 10:40:29 compute-0 python3.9[99431]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:29 compute-0 sudo[99428]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:29 compute-0 sshd-session[99277]: Connection closed by invalid user sdadmin 45.148.10.240 port 56366 [preauth]
Feb 23 10:40:29 compute-0 sudo[99581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iasrbnabtrucpqavmkantpxhrqktifgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843229.7174115-43-163696543208246/AnsiballZ_file.py'
Feb 23 10:40:29 compute-0 sudo[99581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:30 compute-0 python3.9[99584]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:30 compute-0 sudo[99581]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:30 compute-0 sshd-session[99592]: Connection closed by authenticating user root 143.198.30.3 port 34204 [preauth]
Feb 23 10:40:31 compute-0 sshd-session[99611]: Invalid user ubuntu from 45.148.10.240 port 56378
Feb 23 10:40:31 compute-0 python3.9[99738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:40:31 compute-0 sshd-session[99611]: Connection closed by invalid user ubuntu 45.148.10.240 port 56378 [preauth]
Feb 23 10:40:31 compute-0 sudo[99889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfjkepykmtqbdmjheuazmnfpoajeiexh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843231.400099-131-99497859968398/AnsiballZ_seboolean.py'
Feb 23 10:40:31 compute-0 sudo[99889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:31 compute-0 python3.9[99892]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 10:40:32 compute-0 sudo[99889]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:32 compute-0 sshd-session[99893]: Invalid user admin from 45.148.10.240 port 56386
Feb 23 10:40:32 compute-0 sshd-session[99893]: Connection closed by invalid user admin 45.148.10.240 port 56386 [preauth]
Feb 23 10:40:33 compute-0 python3.9[100045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:33 compute-0 sshd-session[100021]: Invalid user sdadmin from 45.148.10.240 port 56388
Feb 23 10:40:34 compute-0 python3.9[100167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843232.7736537-147-255103962508444/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:34 compute-0 sshd-session[100021]: Connection closed by invalid user sdadmin 45.148.10.240 port 56388 [preauth]
Feb 23 10:40:34 compute-0 python3.9[100317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:35 compute-0 python3.9[100438]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843234.2477584-177-101877169184899/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:35 compute-0 sudo[100590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scqchdwneocfziaqhtngxvrarrlbhlkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843235.6004944-211-109009237738766/AnsiballZ_setup.py'
Feb 23 10:40:35 compute-0 sudo[100590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:35 compute-0 sshd-session[100463]: Invalid user admin from 45.148.10.240 port 56396
Feb 23 10:40:36 compute-0 sshd-session[100463]: Connection closed by invalid user admin 45.148.10.240 port 56396 [preauth]
Feb 23 10:40:36 compute-0 python3.9[100593]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:40:36 compute-0 sudo[100590]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:36 compute-0 sudo[100677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shmhvqcgqzkrbznngmowsgzfxvzryxkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843235.6004944-211-109009237738766/AnsiballZ_dnf.py'
Feb 23 10:40:36 compute-0 sudo[100677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:36 compute-0 python3.9[100680]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:40:37 compute-0 sshd-session[100602]: Invalid user admin from 45.148.10.240 port 47696
Feb 23 10:40:37 compute-0 sshd-session[100602]: Connection closed by invalid user admin 45.148.10.240 port 47696 [preauth]
Feb 23 10:40:38 compute-0 sudo[100677]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:38 compute-0 sudo[100833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejukcvjitafbgsvwqoxgufvcexhqjgic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843238.3416207-235-62743026366009/AnsiballZ_systemd.py'
Feb 23 10:40:38 compute-0 sudo[100833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:38 compute-0 sshd-session[100758]: Invalid user admin from 45.148.10.240 port 47700
Feb 23 10:40:39 compute-0 sshd-session[100758]: Connection closed by invalid user admin 45.148.10.240 port 47700 [preauth]
Feb 23 10:40:39 compute-0 python3.9[100836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:40:39 compute-0 sudo[100833]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:39 compute-0 python3.9[100989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:40 compute-0 python3.9[101110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843239.4582965-251-101587544352626/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:40 compute-0 python3.9[101260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:41 compute-0 python3.9[101381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843240.5321422-251-64451143186086/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:42 compute-0 python3.9[101531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:43 compute-0 python3.9[101652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843242.3167608-339-63458751611751/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:43 compute-0 ovn_controller[97601]: 2026-02-23T10:40:43Z|00025|memory|INFO|16256 kB peak resident set size after 30.0 seconds
Feb 23 10:40:43 compute-0 ovn_controller[97601]: 2026-02-23T10:40:43Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Feb 23 10:40:43 compute-0 podman[101776]: 2026-02-23 10:40:43.598204807 +0000 UTC m=+0.086096101 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 10:40:43 compute-0 python3.9[101817]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:44 compute-0 python3.9[101949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843243.2797577-339-155671280770482/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:45 compute-0 python3.9[102099]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:40:45 compute-0 sudo[102251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnumdfzdjvgsvnosxvqatfkthkmxzrzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843245.4544396-415-261770264347427/AnsiballZ_file.py'
Feb 23 10:40:45 compute-0 sudo[102251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:45 compute-0 python3.9[102254]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:45 compute-0 sudo[102251]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:46 compute-0 sudo[102404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkxgczclipedgltnawxpwblrtvglsdin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843246.159754-431-148930478454857/AnsiballZ_stat.py'
Feb 23 10:40:46 compute-0 sudo[102404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:46 compute-0 python3.9[102407]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:46 compute-0 sudo[102404]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:46 compute-0 sudo[102483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqbybrkxttwfnstyxtcgfvzwnxxloisq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843246.159754-431-148930478454857/AnsiballZ_file.py'
Feb 23 10:40:46 compute-0 sudo[102483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:47 compute-0 python3.9[102486]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:47 compute-0 sudo[102483]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:47 compute-0 sudo[102636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoyytmecobemczvxiqbqmsunwucilcwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843247.1561852-431-201270700388805/AnsiballZ_stat.py'
Feb 23 10:40:47 compute-0 sudo[102636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:47 compute-0 python3.9[102639]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:47 compute-0 sudo[102636]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:47 compute-0 sudo[102715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slnpcdpetfjmzhylvombehjwhmhtrsfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843247.1561852-431-201270700388805/AnsiballZ_file.py'
Feb 23 10:40:47 compute-0 sudo[102715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:48 compute-0 python3.9[102718]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:48 compute-0 sudo[102715]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:48 compute-0 sudo[102868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgwvjylqmcqjbtsumuuxfbznrbfkjahv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843248.2134066-477-183459666201540/AnsiballZ_file.py'
Feb 23 10:40:48 compute-0 sudo[102868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:48 compute-0 python3.9[102871]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:48 compute-0 sudo[102868]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:49 compute-0 sudo[103021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfmtendhohdgrncsluycljqqxsrgpoyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843248.8713932-493-266368233928859/AnsiballZ_stat.py'
Feb 23 10:40:49 compute-0 sudo[103021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:49 compute-0 python3.9[103024]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:49 compute-0 sudo[103021]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:49 compute-0 sudo[103100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmtthtlpiwgnrtftmtxsiqsjheejryln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843248.8713932-493-266368233928859/AnsiballZ_file.py'
Feb 23 10:40:49 compute-0 sudo[103100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:49 compute-0 python3.9[103103]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:49 compute-0 sudo[103100]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:50 compute-0 sudo[103253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zofbgardmccfxqnurvpkelsatzyjbiuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843249.952754-517-205917023145614/AnsiballZ_stat.py'
Feb 23 10:40:50 compute-0 sudo[103253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:50 compute-0 python3.9[103256]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:50 compute-0 sudo[103253]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:50 compute-0 sudo[103332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbanvcyjiygrltwvejdqpsgdeprwxsup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843249.952754-517-205917023145614/AnsiballZ_file.py'
Feb 23 10:40:50 compute-0 sudo[103332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:50 compute-0 python3.9[103335]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:50 compute-0 sudo[103332]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:51 compute-0 sudo[103485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymczsbjnzbjxllxycyxzxqypmttowdeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843251.1796408-541-21274786622616/AnsiballZ_systemd.py'
Feb 23 10:40:51 compute-0 sudo[103485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:51 compute-0 python3.9[103488]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:40:51 compute-0 systemd[1]: Reloading.
Feb 23 10:40:51 compute-0 systemd-rc-local-generator[103511]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:40:51 compute-0 systemd-sysv-generator[103516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:40:52 compute-0 sudo[103485]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:52 compute-0 sudo[103681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaozerklvveunspjaxjtjnobspzjaddn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843252.309907-557-83534412210060/AnsiballZ_stat.py'
Feb 23 10:40:52 compute-0 sudo[103681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:52 compute-0 python3.9[103684]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:52 compute-0 sudo[103681]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:52 compute-0 sudo[103760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blebbdsckxqjqrdcztnvxptmutjvmrhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843252.309907-557-83534412210060/AnsiballZ_file.py'
Feb 23 10:40:52 compute-0 sudo[103760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:53 compute-0 python3.9[103763]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:53 compute-0 sudo[103760]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:53 compute-0 sudo[103913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idwkvovovlfkikupuiuslooubznduyjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843253.5371704-581-197145283030119/AnsiballZ_stat.py'
Feb 23 10:40:53 compute-0 sudo[103913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:53 compute-0 python3.9[103916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:53 compute-0 sudo[103913]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:54 compute-0 sudo[103992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-banqhldrkudtweaaxyepineoqwxpknow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843253.5371704-581-197145283030119/AnsiballZ_file.py'
Feb 23 10:40:54 compute-0 sudo[103992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:54 compute-0 python3.9[103995]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:54 compute-0 sudo[103992]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:54 compute-0 sudo[104145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izatdxyzylqfuntktrpgmqyedzcpynex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843254.6787057-605-210767369274900/AnsiballZ_systemd.py'
Feb 23 10:40:54 compute-0 sudo[104145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:55 compute-0 python3.9[104148]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:40:55 compute-0 systemd[1]: Reloading.
Feb 23 10:40:55 compute-0 systemd-rc-local-generator[104172]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:40:55 compute-0 systemd-sysv-generator[104178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:40:55 compute-0 systemd[1]: Starting Create netns directory...
Feb 23 10:40:55 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 10:40:55 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 10:40:55 compute-0 systemd[1]: Finished Create netns directory.
Feb 23 10:40:55 compute-0 sudo[104145]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:56 compute-0 sudo[104346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iafeamyelvvvqzbxsbsmnscdqgtgvhqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843256.0121584-625-220968647878999/AnsiballZ_file.py'
Feb 23 10:40:56 compute-0 sudo[104346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:56 compute-0 python3.9[104349]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:56 compute-0 sudo[104346]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:56 compute-0 sudo[104499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chulkpwyjncezacctzibjigiwochqnrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843256.6203594-641-4936180547821/AnsiballZ_stat.py'
Feb 23 10:40:56 compute-0 sudo[104499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:57 compute-0 python3.9[104502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:40:57 compute-0 sudo[104499]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:57 compute-0 sudo[104623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlvwvabonntadanvpybrrejzsrbuxbjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843256.6203594-641-4936180547821/AnsiballZ_copy.py'
Feb 23 10:40:57 compute-0 sudo[104623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:57 compute-0 python3.9[104626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843256.6203594-641-4936180547821/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:57 compute-0 sudo[104623]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:58 compute-0 sudo[104776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntmdwhmikfqramesbrcaxpstpapgbhfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843258.2664278-675-188367306110108/AnsiballZ_file.py'
Feb 23 10:40:58 compute-0 sudo[104776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:58 compute-0 python3.9[104779]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:40:58 compute-0 sudo[104776]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:59 compute-0 sshd-session[104849]: Connection closed by authenticating user root 165.227.79.48 port 44172 [preauth]
Feb 23 10:40:59 compute-0 sudo[104931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxkfgflbmqvutmgikobezxxvbbpwnolc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843258.979455-691-39468741706701/AnsiballZ_file.py'
Feb 23 10:40:59 compute-0 sudo[104931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:40:59 compute-0 python3.9[104934]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:40:59 compute-0 sudo[104931]: pam_unix(sudo:session): session closed for user root
Feb 23 10:40:59 compute-0 sudo[105084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukahwhfjnaitjgckanvtseehdhxtyrpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843259.587514-707-229745183682775/AnsiballZ_stat.py'
Feb 23 10:40:59 compute-0 sudo[105084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:00 compute-0 python3.9[105087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:41:00 compute-0 sudo[105084]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:00 compute-0 sudo[105208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djahaaxzphvqztjmwwfuimorwpsepzsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843259.587514-707-229745183682775/AnsiballZ_copy.py'
Feb 23 10:41:00 compute-0 sudo[105208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:00 compute-0 python3.9[105211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843259.587514-707-229745183682775/.source.json _original_basename=.7adtsfkv follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:00 compute-0 sudo[105208]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:01 compute-0 python3.9[105361]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:03 compute-0 sudo[105782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgbehhjcpokvgafkcpzmykcynhgiierg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843262.764438-787-105309177830191/AnsiballZ_container_config_data.py'
Feb 23 10:41:03 compute-0 sudo[105782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:03 compute-0 sshd-session[105786]: Connection closed by authenticating user root 143.198.30.3 port 42978 [preauth]
Feb 23 10:41:03 compute-0 python3.9[105785]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 23 10:41:03 compute-0 sudo[105782]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:04 compute-0 sudo[105937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iurfajxkhkhpvdeqxjnmzmvorrxtaxxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843263.7766926-809-140886889601685/AnsiballZ_container_config_hash.py'
Feb 23 10:41:04 compute-0 sudo[105937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:04 compute-0 python3.9[105940]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 10:41:04 compute-0 sudo[105937]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:05 compute-0 sudo[106090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvdkgvagsotligmqkptkdymrpyemxwqz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843264.663113-829-52144759270408/AnsiballZ_edpm_container_manage.py'
Feb 23 10:41:05 compute-0 sudo[106090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:05 compute-0 python3[106093]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 10:41:05 compute-0 podman[106128]: 2026-02-23 10:41:05.508263057 +0000 UTC m=+0.021731681 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 10:41:05 compute-0 podman[106128]: 2026-02-23 10:41:05.623906257 +0000 UTC m=+0.137374871 container create 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 23 10:41:05 compute-0 python3[106093]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 10:41:06 compute-0 sudo[106090]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:07 compute-0 sudo[106316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcqexhtksyotuclcjbrdjpkdechjxvyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843266.910921-845-123296513602375/AnsiballZ_stat.py'
Feb 23 10:41:07 compute-0 sudo[106316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:07 compute-0 python3.9[106319]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:41:07 compute-0 sudo[106316]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:08 compute-0 sudo[106471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plrkmeabwoqilwulltksixqrlqldjjnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843267.6184466-863-133283075849403/AnsiballZ_file.py'
Feb 23 10:41:08 compute-0 sudo[106471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:08 compute-0 python3.9[106474]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:08 compute-0 sudo[106471]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:08 compute-0 sudo[106548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyxyvsttkgmfyvbcckytmkdhxnoxxflr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843267.6184466-863-133283075849403/AnsiballZ_stat.py'
Feb 23 10:41:08 compute-0 sudo[106548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:08 compute-0 python3.9[106551]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:41:08 compute-0 sudo[106548]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:09 compute-0 sudo[106700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqwnskddxgijqwkhunqwyowvcutjatme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843268.6901128-863-86414769654464/AnsiballZ_copy.py'
Feb 23 10:41:09 compute-0 sudo[106700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:09 compute-0 python3.9[106703]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771843268.6901128-863-86414769654464/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:09 compute-0 sudo[106700]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:09 compute-0 sudo[106777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zskhflhzhqcdxogsjixcelwasauxzirs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843268.6901128-863-86414769654464/AnsiballZ_systemd.py'
Feb 23 10:41:09 compute-0 sudo[106777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:09 compute-0 python3.9[106780]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:41:09 compute-0 systemd[1]: Reloading.
Feb 23 10:41:09 compute-0 systemd-rc-local-generator[106801]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:41:09 compute-0 systemd-sysv-generator[106805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:41:10 compute-0 sudo[106777]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:10 compute-0 sudo[106895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amjeazvzvjgrygpllzhrsjtqvoxjwcvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843268.6901128-863-86414769654464/AnsiballZ_systemd.py'
Feb 23 10:41:10 compute-0 sudo[106895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:10 compute-0 python3.9[106898]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:10 compute-0 systemd[1]: Reloading.
Feb 23 10:41:10 compute-0 systemd-sysv-generator[106929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:41:10 compute-0 systemd-rc-local-generator[106926]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:41:10 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 23 10:41:10 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:41:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/844a19260e4b1270d384e077173bf1100d6e848baa24f99924b97cfac7e88811/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 10:41:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/844a19260e4b1270d384e077173bf1100d6e848baa24f99924b97cfac7e88811/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:41:10 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7.
Feb 23 10:41:10 compute-0 podman[106947]: 2026-02-23 10:41:10.94242411 +0000 UTC m=+0.123764916 container init 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 10:41:10 compute-0 ovn_metadata_agent[106963]: + sudo -E kolla_set_configs
Feb 23 10:41:10 compute-0 podman[106947]: 2026-02-23 10:41:10.968148534 +0000 UTC m=+0.149489310 container start 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:41:10 compute-0 edpm-start-podman-container[106947]: ovn_metadata_agent
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Validating config file
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Copying service configuration files
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Writing out command to execute
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 10:41:11 compute-0 podman[106970]: 2026-02-23 10:41:11.025845817 +0000 UTC m=+0.049162010 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 23 10:41:11 compute-0 edpm-start-podman-container[106946]: Creating additional drop-in dependency for "ovn_metadata_agent" (7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7)
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: ++ cat /run_command
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + CMD=neutron-ovn-metadata-agent
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + ARGS=
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + sudo kolla_copy_cacerts
Feb 23 10:41:11 compute-0 systemd[1]: Reloading.
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + [[ ! -n '' ]]
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + . kolla_extend_start
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + umask 0022
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: + exec neutron-ovn-metadata-agent
Feb 23 10:41:11 compute-0 ovn_metadata_agent[106963]: Running command: 'neutron-ovn-metadata-agent'
Feb 23 10:41:11 compute-0 systemd-rc-local-generator[107041]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:41:11 compute-0 systemd-sysv-generator[107045]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:41:11 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 23 10:41:11 compute-0 sudo[106895]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:12 compute-0 python3.9[107208]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.584 106968 INFO neutron.common.config [-] Logging enabled!
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.584 106968 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.584 106968 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.585 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.585 106968 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.585 106968 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.585 106968 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.585 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.586 106968 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.587 106968 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.588 106968 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.589 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.590 106968 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.591 106968 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.592 106968 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.593 106968 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.594 106968 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.595 106968 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.596 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.597 106968 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.598 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.599 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.600 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.601 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.602 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.603 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.604 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.605 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.606 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.607 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.608 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.609 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.610 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.611 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.612 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.613 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.614 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.615 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.616 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.617 106968 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.626 106968 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.626 106968 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.626 106968 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.627 106968 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.627 106968 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.639 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 260ff7a6-2911-481e-914f-54dc92f9c3bf (UUID: 260ff7a6-2911-481e-914f-54dc92f9c3bf) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.663 106968 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.664 106968 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.664 106968 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.664 106968 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.667 106968 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.672 106968 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.678 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '260ff7a6-2911-481e-914f-54dc92f9c3bf'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], external_ids={}, name=260ff7a6-2911-481e-914f-54dc92f9c3bf, nb_cfg_timestamp=1771843221641, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.678 106968 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f3c56d03df0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.679 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.679 106968 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.679 106968 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.680 106968 INFO oslo_service.service [-] Starting 1 workers
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.684 106968 DEBUG oslo_service.service [-] Started child 107278 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.686 106968 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpvqlzq6an/privsep.sock']
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.688 107278 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-159828'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.730 107278 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.731 107278 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.731 107278 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.738 107278 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.750 107278 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:41:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:12.758 107278 INFO eventlet.wsgi.server [-] (107278) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 23 10:41:12 compute-0 sudo[107363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oteoymnfazahywdgdvqwzuvdtksuhzyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843272.6368504-953-195648231624299/AnsiballZ_stat.py'
Feb 23 10:41:12 compute-0 sudo[107363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:13 compute-0 python3.9[107366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:41:13 compute-0 sudo[107363]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:13 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.253 106968 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.254 106968 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvqlzq6an/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.163 107369 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.166 107369 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.170 107369 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.170 107369 INFO oslo.privsep.daemon [-] privsep daemon running as pid 107369
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.256 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[f100c317-ff6a-4c91-8a02-1612a52b4f1a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:41:13 compute-0 sudo[107494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnrivkunfjmkkiqcskyovffozngtqijt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843272.6368504-953-195648231624299/AnsiballZ_copy.py'
Feb 23 10:41:13 compute-0 sudo[107494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:13 compute-0 python3.9[107497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843272.6368504-953-195648231624299/.source.yaml _original_basename=.i3qzhcx7 follow=False checksum=373f508c30754196e0d6e33faf96a82ce180dabb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:13 compute-0 sudo[107494]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.717 107369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.717 107369 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:41:13 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:13.717 107369 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:41:13 compute-0 podman[107498]: 2026-02-23 10:41:13.759770331 +0000 UTC m=+0.066857874 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 10:41:14 compute-0 sshd-session[98659]: Connection closed by 192.168.122.30 port 51302
Feb 23 10:41:14 compute-0 sshd-session[98656]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:41:14 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 23 10:41:14 compute-0 systemd[1]: session-22.scope: Consumed 28.843s CPU time.
Feb 23 10:41:14 compute-0 systemd-logind[808]: Session 22 logged out. Waiting for processes to exit.
Feb 23 10:41:14 compute-0 systemd-logind[808]: Removed session 22.
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.204 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[b554f2cf-c985-452d-b955-89a2fe081444]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.206 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, column=external_ids, values=({'neutron:ovn-metadata-id': '8cdaf2b3-222d-571b-bcff-8996b7640e1f'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.216 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.224 106968 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.225 106968 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.225 106968 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.225 106968 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.225 106968 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.225 106968 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.226 106968 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.226 106968 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.227 106968 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.227 106968 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.227 106968 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.227 106968 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.228 106968 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.228 106968 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.228 106968 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.228 106968 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.229 106968 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.229 106968 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.229 106968 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.230 106968 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.230 106968 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.230 106968 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.230 106968 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.231 106968 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.231 106968 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.232 106968 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.232 106968 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.232 106968 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.233 106968 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.233 106968 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.233 106968 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.233 106968 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.234 106968 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.234 106968 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.234 106968 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.235 106968 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.235 106968 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.235 106968 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.236 106968 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.236 106968 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.236 106968 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.236 106968 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.237 106968 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.237 106968 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.237 106968 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.237 106968 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.238 106968 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.239 106968 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.240 106968 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.241 106968 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.242 106968 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.243 106968 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.244 106968 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.245 106968 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.246 106968 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.247 106968 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.248 106968 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.249 106968 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.250 106968 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.251 106968 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.252 106968 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.253 106968 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.254 106968 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.255 106968 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.256 106968 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.257 106968 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.258 106968 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.259 106968 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.260 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.261 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.262 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.263 106968 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.264 106968 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.264 106968 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:41:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:41:14.264 106968 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 10:41:19 compute-0 sshd-session[107549]: Accepted publickey for zuul from 192.168.122.30 port 50462 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:41:19 compute-0 systemd-logind[808]: New session 23 of user zuul.
Feb 23 10:41:19 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 23 10:41:19 compute-0 sshd-session[107549]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:41:20 compute-0 python3.9[107702]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:41:21 compute-0 sudo[107856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpogeywhsalfdaqtzhthfxeyqhaghwur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843281.4350708-43-170439023921824/AnsiballZ_command.py'
Feb 23 10:41:21 compute-0 sudo[107856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:22 compute-0 python3.9[107859]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:22 compute-0 sudo[107856]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:22 compute-0 sudo[108022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaqvuloyvlmchbgcgfhcpoqmtilscatp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843282.454918-65-50546387591888/AnsiballZ_systemd_service.py'
Feb 23 10:41:22 compute-0 sudo[108022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:23 compute-0 python3.9[108025]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:41:23 compute-0 systemd[1]: Reloading.
Feb 23 10:41:23 compute-0 systemd-sysv-generator[108049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:41:23 compute-0 systemd-rc-local-generator[108045]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:41:23 compute-0 sudo[108022]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:24 compute-0 python3.9[108217]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:41:24 compute-0 network[108234]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:41:24 compute-0 network[108235]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:41:24 compute-0 network[108236]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:41:27 compute-0 sudo[108496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgmgfttgizdlqructrdxqgowwsgeupwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843287.6903157-103-213365721115343/AnsiballZ_systemd_service.py'
Feb 23 10:41:27 compute-0 sudo[108496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:28 compute-0 python3.9[108499]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:28 compute-0 sudo[108496]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:28 compute-0 sudo[108650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcjmlateiughlwoxeynwydzyfqixssve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843288.3707952-103-192066926530259/AnsiballZ_systemd_service.py'
Feb 23 10:41:28 compute-0 sudo[108650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:28 compute-0 python3.9[108653]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:28 compute-0 sudo[108650]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:29 compute-0 sudo[108804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avkwziqlbddhongjlfdvtcdkcztxqwni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843289.080634-103-77039907368076/AnsiballZ_systemd_service.py'
Feb 23 10:41:29 compute-0 sudo[108804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:29 compute-0 python3.9[108807]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:29 compute-0 sudo[108804]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:29 compute-0 sudo[108958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhuvenjgzverqmwwcveiupzfajpkjlni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843289.751474-103-224178834867554/AnsiballZ_systemd_service.py'
Feb 23 10:41:30 compute-0 sudo[108958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:30 compute-0 python3.9[108961]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:30 compute-0 sudo[108958]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:30 compute-0 sudo[109112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzumhmswnsplescrwmflwccpqsvphbuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843290.4269304-103-228957427049518/AnsiballZ_systemd_service.py'
Feb 23 10:41:30 compute-0 sudo[109112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:30 compute-0 python3.9[109115]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:30 compute-0 sudo[109112]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:31 compute-0 sudo[109266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsenronauoadvdfiwzkeaxlfwloowphh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843291.0685425-103-253201786125630/AnsiballZ_systemd_service.py'
Feb 23 10:41:31 compute-0 sudo[109266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:31 compute-0 python3.9[109269]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:31 compute-0 sudo[109266]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:31 compute-0 sudo[109420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-facxpjyyicpwiuumzcngfznmicnovbnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843291.7130163-103-264989333749555/AnsiballZ_systemd_service.py'
Feb 23 10:41:31 compute-0 sudo[109420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:32 compute-0 python3.9[109423]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:41:32 compute-0 sudo[109420]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:33 compute-0 sudo[109574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utpgajavwhsmgrowyzfirywiimvbkvqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843292.852181-207-100060029599040/AnsiballZ_file.py'
Feb 23 10:41:33 compute-0 sudo[109574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:33 compute-0 python3.9[109577]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:33 compute-0 sudo[109574]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:33 compute-0 sudo[109727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehjdtpopimvbokhuplvxjyxhpbemtcvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843293.5648997-207-201511506843610/AnsiballZ_file.py'
Feb 23 10:41:33 compute-0 sudo[109727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:33 compute-0 python3.9[109730]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:33 compute-0 sudo[109727]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:34 compute-0 sudo[109880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwsmubajenvfdicuxbcjfasppyvyjxou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843294.1272135-207-26190330246006/AnsiballZ_file.py'
Feb 23 10:41:34 compute-0 sudo[109880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:34 compute-0 python3.9[109883]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:34 compute-0 sudo[109880]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:34 compute-0 sudo[110033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmoowugsgregduylrbknjjezxbnibdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843294.6433177-207-75619251610295/AnsiballZ_file.py'
Feb 23 10:41:34 compute-0 sudo[110033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:35 compute-0 python3.9[110036]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:35 compute-0 sudo[110033]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:35 compute-0 sudo[110186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adnkkxrmodcmvxwzptfuzyagqiyhkgah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843295.2733748-207-149252274813538/AnsiballZ_file.py'
Feb 23 10:41:35 compute-0 sudo[110186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:35 compute-0 python3.9[110189]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:35 compute-0 sudo[110186]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:36 compute-0 sudo[110339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbexmthxillhdooutbiknqtwakoqavwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843295.8214014-207-76490568248039/AnsiballZ_file.py'
Feb 23 10:41:36 compute-0 sudo[110339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:36 compute-0 python3.9[110342]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:36 compute-0 sudo[110339]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:36 compute-0 sudo[110492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkwojekuqxbifcxqpcttsstrjurogwdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843296.364455-207-14399695368821/AnsiballZ_file.py'
Feb 23 10:41:36 compute-0 sudo[110492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:36 compute-0 python3.9[110495]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:36 compute-0 sudo[110492]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:36 compute-0 sshd-session[110511]: Connection closed by authenticating user root 143.198.30.3 port 39946 [preauth]
Feb 23 10:41:37 compute-0 sudo[110647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btsonytwfkxisqcikqgbvqpkgxswtzsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843297.4769547-307-168725443950550/AnsiballZ_file.py'
Feb 23 10:41:37 compute-0 sudo[110647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:37 compute-0 python3.9[110650]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:37 compute-0 sudo[110647]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:38 compute-0 sudo[110800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cspvcssovsvrmicklzmgmfgcbldhnyxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843298.0362465-307-201152199012445/AnsiballZ_file.py'
Feb 23 10:41:38 compute-0 sudo[110800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:38 compute-0 python3.9[110803]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:38 compute-0 sudo[110800]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:38 compute-0 sudo[110953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbrniucfdrdoxocowctnzylghhvzuevw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843298.6038623-307-61271811664069/AnsiballZ_file.py'
Feb 23 10:41:38 compute-0 sudo[110953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:39 compute-0 python3.9[110956]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:39 compute-0 sudo[110953]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:39 compute-0 sudo[111106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eauxytmjamjysypttvhsvtbopijyukvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843299.1456835-307-47391853606194/AnsiballZ_file.py'
Feb 23 10:41:39 compute-0 sudo[111106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:39 compute-0 python3.9[111109]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:39 compute-0 sudo[111106]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:39 compute-0 sudo[111259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smexrvbwgcgdmggtbjlrzdvmmnpywqbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843299.7028878-307-208133592026203/AnsiballZ_file.py'
Feb 23 10:41:39 compute-0 sudo[111259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:40 compute-0 python3.9[111262]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:40 compute-0 sudo[111259]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:40 compute-0 sudo[111412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pavgkklsvlxmgwvpxtdmrraobnpgstav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843300.2540023-307-95963340096892/AnsiballZ_file.py'
Feb 23 10:41:40 compute-0 sudo[111412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:40 compute-0 python3.9[111415]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:40 compute-0 sudo[111412]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:41 compute-0 sudo[111565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keuqlzotwefusjqtegvponomutoiseyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843300.7265232-307-241956696267603/AnsiballZ_file.py'
Feb 23 10:41:41 compute-0 sudo[111565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:41 compute-0 python3.9[111568]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:41:41 compute-0 sudo[111565]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:41 compute-0 podman[111569]: 2026-02-23 10:41:41.654401984 +0000 UTC m=+0.364070326 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:41:42 compute-0 sudo[111738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzwogkgcecqdbpyrqaybvsfeyuwobir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843302.1245098-409-130829929188056/AnsiballZ_command.py'
Feb 23 10:41:42 compute-0 sudo[111738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:42 compute-0 python3.9[111741]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:42 compute-0 sudo[111738]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:43 compute-0 python3.9[111893]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 10:41:43 compute-0 podman[111918]: 2026-02-23 10:41:43.8730803 +0000 UTC m=+0.081484057 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:41:44 compute-0 sudo[112069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqdvworehigieyieimlmzinynanchdcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843303.8391814-445-61943053566699/AnsiballZ_systemd_service.py'
Feb 23 10:41:44 compute-0 sudo[112069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:44 compute-0 python3.9[112072]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:41:44 compute-0 systemd[1]: Reloading.
Feb 23 10:41:44 compute-0 systemd-sysv-generator[112104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:41:44 compute-0 systemd-rc-local-generator[112101]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:41:44 compute-0 sudo[112069]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:45 compute-0 sudo[112264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oihxlrpagnmeevijdaqqcrobpczszxyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843305.0007594-461-34293553576855/AnsiballZ_command.py'
Feb 23 10:41:45 compute-0 sudo[112264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:45 compute-0 python3.9[112267]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:45 compute-0 sudo[112264]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:45 compute-0 sudo[112418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiqtkfyeordgowanqerxhwwobaxhmvaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843305.522231-461-84807308462700/AnsiballZ_command.py'
Feb 23 10:41:45 compute-0 sudo[112418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:45 compute-0 python3.9[112421]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:45 compute-0 sudo[112418]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:46 compute-0 sudo[112572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axhnsfppuwexcdgeypcyrkljfbdasfvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843306.051583-461-43484008396091/AnsiballZ_command.py'
Feb 23 10:41:46 compute-0 sudo[112572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:46 compute-0 sshd-session[112573]: Connection closed by authenticating user root 165.227.79.48 port 45712 [preauth]
Feb 23 10:41:46 compute-0 python3.9[112577]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:46 compute-0 sudo[112572]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:46 compute-0 sudo[112728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjdkoemdfroblwbptuwwkcglhcaeisqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843306.5342503-461-67247060027323/AnsiballZ_command.py'
Feb 23 10:41:46 compute-0 sudo[112728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:46 compute-0 python3.9[112731]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:46 compute-0 sudo[112728]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:47 compute-0 sudo[112882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpufukxrpyqztkurqmdwlthrcurzfejv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843307.0667777-461-143320690748217/AnsiballZ_command.py'
Feb 23 10:41:47 compute-0 sudo[112882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:47 compute-0 python3.9[112885]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:47 compute-0 sudo[112882]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:47 compute-0 sudo[113036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgrysaaxjzcyiljqbribpzdvralgplzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843307.5315244-461-67209373955858/AnsiballZ_command.py'
Feb 23 10:41:47 compute-0 sudo[113036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:47 compute-0 python3.9[113039]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:47 compute-0 sudo[113036]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:48 compute-0 sudo[113190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuipvhiyxqvjjufjfbltxgrsoixhrpze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843307.9967396-461-158370880655363/AnsiballZ_command.py'
Feb 23 10:41:48 compute-0 sudo[113190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:48 compute-0 python3.9[113193]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:41:48 compute-0 sudo[113190]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:49 compute-0 sudo[113344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyrlghkqjxgctyaynqtgunukjuyuyfwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843309.4524374-569-120295045611308/AnsiballZ_getent.py'
Feb 23 10:41:49 compute-0 sudo[113344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:49 compute-0 python3.9[113347]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 23 10:41:49 compute-0 sudo[113344]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:50 compute-0 sudo[113498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ythplplogakahhgivyqqdyljgnlaqfel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843310.158963-585-73966199544701/AnsiballZ_group.py'
Feb 23 10:41:50 compute-0 sudo[113498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:50 compute-0 python3.9[113501]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 10:41:50 compute-0 groupadd[113502]: group added to /etc/group: name=libvirt, GID=42473
Feb 23 10:41:50 compute-0 groupadd[113502]: group added to /etc/gshadow: name=libvirt
Feb 23 10:41:50 compute-0 groupadd[113502]: new group: name=libvirt, GID=42473
Feb 23 10:41:50 compute-0 sudo[113498]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:51 compute-0 sudo[113657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwvzdajyxnogijrvmhcvysfqcysmouhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843310.9996023-601-196802779347099/AnsiballZ_user.py'
Feb 23 10:41:51 compute-0 sudo[113657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:51 compute-0 python3.9[113660]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 10:41:51 compute-0 useradd[113662]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 23 10:41:51 compute-0 sudo[113657]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:52 compute-0 sudo[113818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icdknvytqpylwxxtlvmlygrudrdeifys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843312.0560796-623-72876713361147/AnsiballZ_setup.py'
Feb 23 10:41:52 compute-0 sudo[113818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:52 compute-0 python3.9[113821]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:41:52 compute-0 sudo[113818]: pam_unix(sudo:session): session closed for user root
Feb 23 10:41:53 compute-0 sudo[113903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjudidowjdhpheplvzyvgcecmndqngke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843312.0560796-623-72876713361147/AnsiballZ_dnf.py'
Feb 23 10:41:53 compute-0 sudo[113903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:41:53 compute-0 python3.9[113906]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:42:10 compute-0 sshd-session[114091]: Connection closed by authenticating user root 143.198.30.3 port 36084 [preauth]
Feb 23 10:42:11 compute-0 podman[114098]: 2026-02-23 10:42:11.839752641 +0000 UTC m=+0.044977710 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:42:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:42:12.619 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:42:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:42:12.620 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:42:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:42:12.620 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:42:14 compute-0 podman[114119]: 2026-02-23 10:42:14.866343614 +0000 UTC m=+0.071288623 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 23 10:42:19 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:42:19 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:42:28 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:42:28 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:42:33 compute-0 sshd-session[114161]: Connection closed by authenticating user root 165.227.79.48 port 48264 [preauth]
Feb 23 10:42:42 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 23 10:42:42 compute-0 podman[116907]: 2026-02-23 10:42:42.85725701 +0000 UTC m=+0.048159304 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:42:43 compute-0 sshd-session[117728]: Connection closed by authenticating user root 143.198.30.3 port 44472 [preauth]
Feb 23 10:42:45 compute-0 podman[119645]: 2026-02-23 10:42:45.884522653 +0000 UTC m=+0.080866562 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
container_name=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 10:43:09 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 10:43:09 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 10:43:10 compute-0 groupadd[131118]: group added to /etc/group: name=dnsmasq, GID=993
Feb 23 10:43:10 compute-0 groupadd[131118]: group added to /etc/gshadow: name=dnsmasq
Feb 23 10:43:10 compute-0 groupadd[131118]: new group: name=dnsmasq, GID=993
Feb 23 10:43:10 compute-0 useradd[131125]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 23 10:43:10 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 23 10:43:10 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 23 10:43:10 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 23 10:43:11 compute-0 groupadd[131138]: group added to /etc/group: name=clevis, GID=992
Feb 23 10:43:11 compute-0 groupadd[131138]: group added to /etc/gshadow: name=clevis
Feb 23 10:43:11 compute-0 groupadd[131138]: new group: name=clevis, GID=992
Feb 23 10:43:11 compute-0 useradd[131145]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 23 10:43:11 compute-0 usermod[131155]: add 'clevis' to group 'tss'
Feb 23 10:43:11 compute-0 usermod[131155]: add 'clevis' to shadow group 'tss'
Feb 23 10:43:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:43:12.620 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:43:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:43:12.622 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:43:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:43:12.622 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:43:13 compute-0 podman[131183]: 2026-02-23 10:43:13.323119696 +0000 UTC m=+0.067997304 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 23 10:43:13 compute-0 polkitd[45250]: Reloading rules
Feb 23 10:43:13 compute-0 polkitd[45250]: Collecting garbage unconditionally...
Feb 23 10:43:13 compute-0 polkitd[45250]: Loading rules from directory /etc/polkit-1/rules.d
Feb 23 10:43:13 compute-0 polkitd[45250]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 23 10:43:13 compute-0 polkitd[45250]: Finished loading, compiling and executing 3 rules
Feb 23 10:43:13 compute-0 polkitd[45250]: Reloading rules
Feb 23 10:43:13 compute-0 polkitd[45250]: Collecting garbage unconditionally...
Feb 23 10:43:13 compute-0 polkitd[45250]: Loading rules from directory /etc/polkit-1/rules.d
Feb 23 10:43:13 compute-0 polkitd[45250]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 23 10:43:13 compute-0 polkitd[45250]: Finished loading, compiling and executing 3 rules
Feb 23 10:43:14 compute-0 groupadd[131366]: group added to /etc/group: name=ceph, GID=167
Feb 23 10:43:14 compute-0 groupadd[131366]: group added to /etc/gshadow: name=ceph
Feb 23 10:43:14 compute-0 groupadd[131366]: new group: name=ceph, GID=167
Feb 23 10:43:14 compute-0 useradd[131372]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 23 10:43:16 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 23 10:43:16 compute-0 sshd[1018]: Received signal 15; terminating.
Feb 23 10:43:16 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 23 10:43:16 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 23 10:43:16 compute-0 systemd[1]: sshd.service: Consumed 15.252s CPU time, read 564.0K from disk, written 612.0K to disk.
Feb 23 10:43:16 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 23 10:43:16 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 23 10:43:16 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 10:43:16 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 10:43:16 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 10:43:16 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 23 10:43:16 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 23 10:43:16 compute-0 sshd[131901]: Server listening on 0.0.0.0 port 22.
Feb 23 10:43:16 compute-0 sshd[131901]: Server listening on :: port 22.
Feb 23 10:43:16 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 23 10:43:16 compute-0 podman[131889]: 2026-02-23 10:43:16.730188613 +0000 UTC m=+0.083634285 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 23 10:43:17 compute-0 sshd-session[132011]: Connection closed by authenticating user root 143.198.30.3 port 52758 [preauth]
Feb 23 10:43:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:43:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:43:17 compute-0 systemd[1]: Reloading.
Feb 23 10:43:17 compute-0 systemd-rc-local-generator[132169]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:17 compute-0 systemd-sysv-generator[132176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:18 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:43:20 compute-0 sudo[113903]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:20 compute-0 sudo[137298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlccnvwthkiffjundodnmqlwlvfjgdqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843400.27103-647-238226948685101/AnsiballZ_systemd.py'
Feb 23 10:43:20 compute-0 sudo[137298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:21 compute-0 python3.9[137328]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:43:21 compute-0 systemd[1]: Reloading.
Feb 23 10:43:21 compute-0 systemd-rc-local-generator[137944]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:21 compute-0 systemd-sysv-generator[137950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:21 compute-0 sudo[137298]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:21 compute-0 sshd-session[138626]: Connection closed by authenticating user root 165.227.79.48 port 54502 [preauth]
Feb 23 10:43:21 compute-0 sudo[139018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlhgmnrbwfslzzbewtdwwjqrxpwdbfjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843401.5468547-647-276666733861651/AnsiballZ_systemd.py'
Feb 23 10:43:21 compute-0 sudo[139018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:22 compute-0 python3.9[139044]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:43:22 compute-0 systemd[1]: Reloading.
Feb 23 10:43:22 compute-0 systemd-sysv-generator[139613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:22 compute-0 systemd-rc-local-generator[139610]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:22 compute-0 sudo[139018]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:22 compute-0 sudo[140538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsjilumiiyyejfxhnvbpbpnoxykiuibk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843402.5353801-647-47589994707721/AnsiballZ_systemd.py'
Feb 23 10:43:22 compute-0 sudo[140538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:23 compute-0 python3.9[140577]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:43:23 compute-0 systemd[1]: Reloading.
Feb 23 10:43:23 compute-0 systemd-rc-local-generator[141115]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:23 compute-0 systemd-sysv-generator[141118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:23 compute-0 sudo[140538]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:23 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:43:23 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:43:23 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.905s CPU time.
Feb 23 10:43:23 compute-0 systemd[1]: run-rfb85096b94ae42bea4405c7b4bcc979c.service: Deactivated successfully.
Feb 23 10:43:23 compute-0 sudo[141333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xasebeusicbwfgiuorpnkbsdxkxwnzep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843403.4722214-647-67872119246618/AnsiballZ_systemd.py'
Feb 23 10:43:23 compute-0 sudo[141333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:23 compute-0 python3.9[141336]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:43:24 compute-0 systemd[1]: Reloading.
Feb 23 10:43:24 compute-0 systemd-rc-local-generator[141365]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:24 compute-0 systemd-sysv-generator[141370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:24 compute-0 sudo[141333]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:24 compute-0 sudo[141531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdywwfkabmskvmsrenqniljshyrkmedt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843404.6098833-705-16630607570459/AnsiballZ_systemd.py'
Feb 23 10:43:24 compute-0 sudo[141531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:25 compute-0 python3.9[141534]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:25 compute-0 systemd[1]: Reloading.
Feb 23 10:43:25 compute-0 systemd-rc-local-generator[141566]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:25 compute-0 systemd-sysv-generator[141571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:25 compute-0 sudo[141531]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:25 compute-0 sudo[141729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swmekcdzyckwyajovzniyuqojmnfglsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843405.6026676-705-10142433363504/AnsiballZ_systemd.py'
Feb 23 10:43:25 compute-0 sudo[141729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:26 compute-0 python3.9[141732]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:26 compute-0 systemd[1]: Reloading.
Feb 23 10:43:26 compute-0 systemd-sysv-generator[141765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:26 compute-0 systemd-rc-local-generator[141760]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:26 compute-0 sudo[141729]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:26 compute-0 sudo[141927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewbhzltbxlqgdmjifqdxzpdxwyvslhgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843406.509526-705-187248398044161/AnsiballZ_systemd.py'
Feb 23 10:43:26 compute-0 sudo[141927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:27 compute-0 python3.9[141930]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:27 compute-0 systemd[1]: Reloading.
Feb 23 10:43:27 compute-0 systemd-rc-local-generator[141956]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:27 compute-0 systemd-sysv-generator[141959]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:27 compute-0 sudo[141927]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:27 compute-0 sudo[142125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-facjkehkznusvvdyjonnatmitnrmanzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843407.3810234-705-130412636940391/AnsiballZ_systemd.py'
Feb 23 10:43:27 compute-0 sudo[142125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:27 compute-0 python3.9[142128]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:27 compute-0 sudo[142125]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:28 compute-0 sudo[142281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khpvafdbjlhuydychzcmlfdeomvzltrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843408.0889184-705-110954317390611/AnsiballZ_systemd.py'
Feb 23 10:43:28 compute-0 sudo[142281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:28 compute-0 python3.9[142284]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:28 compute-0 systemd[1]: Reloading.
Feb 23 10:43:28 compute-0 systemd-sysv-generator[142317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:28 compute-0 systemd-rc-local-generator[142313]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:28 compute-0 sudo[142281]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:29 compute-0 sudo[142479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xchebebpcxogtvarqrexxiqlbbqcmmrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843409.6319087-777-139599393926321/AnsiballZ_systemd.py'
Feb 23 10:43:29 compute-0 sudo[142479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:30 compute-0 python3.9[142482]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 10:43:30 compute-0 systemd[1]: Reloading.
Feb 23 10:43:30 compute-0 systemd-rc-local-generator[142512]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:43:30 compute-0 systemd-sysv-generator[142515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:43:30 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 23 10:43:30 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 23 10:43:30 compute-0 sudo[142479]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:30 compute-0 sudo[142680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymiaelelkburpedclqxotuepudehlpgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843410.6444798-793-239977909682978/AnsiballZ_systemd.py'
Feb 23 10:43:30 compute-0 sudo[142680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:31 compute-0 python3.9[142683]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:31 compute-0 sudo[142680]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:31 compute-0 sudo[142836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqvzklfgsdmoxfzfkfalekyekmujjmcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843411.4156373-793-276882670108171/AnsiballZ_systemd.py'
Feb 23 10:43:31 compute-0 sudo[142836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:31 compute-0 python3.9[142839]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:31 compute-0 sudo[142836]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:32 compute-0 sudo[142992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbckhaqeocooibkxhmhzjfyzapsqthc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843412.0956104-793-80257338183995/AnsiballZ_systemd.py'
Feb 23 10:43:32 compute-0 sudo[142992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:32 compute-0 python3.9[142995]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:33 compute-0 sudo[142992]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:34 compute-0 sudo[143148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzvnwayrfxijnscikxbgloduftpwpfck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843413.8124776-793-280822447922918/AnsiballZ_systemd.py'
Feb 23 10:43:34 compute-0 sudo[143148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:34 compute-0 python3.9[143151]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:34 compute-0 sudo[143148]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:34 compute-0 sudo[143304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moikujmtuknoriecqyupqccsgpybknls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843414.6051059-793-146160910915180/AnsiballZ_systemd.py'
Feb 23 10:43:34 compute-0 sudo[143304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:35 compute-0 python3.9[143307]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:35 compute-0 sudo[143304]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:35 compute-0 sudo[143460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lglvtsrgcypszlaoypjxedyutbutfrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843415.3288615-793-276699123165719/AnsiballZ_systemd.py'
Feb 23 10:43:35 compute-0 sudo[143460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:35 compute-0 python3.9[143463]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:35 compute-0 sudo[143460]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:36 compute-0 sudo[143616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suqwjjgrbwquwbatpejlyduroeauhkwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843416.0393052-793-63649429927012/AnsiballZ_systemd.py'
Feb 23 10:43:36 compute-0 sudo[143616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:36 compute-0 python3.9[143619]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:36 compute-0 sudo[143616]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:36 compute-0 sudo[143772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czujjwihhrzjrbyyahkjkliceomqyiiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843416.6974614-793-79161203723512/AnsiballZ_systemd.py'
Feb 23 10:43:36 compute-0 sudo[143772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:37 compute-0 python3.9[143775]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:37 compute-0 sudo[143772]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:37 compute-0 sudo[143928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdjuuzfqggemhxcbsxoeqmfakxstsdou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843417.444228-793-216769107145660/AnsiballZ_systemd.py'
Feb 23 10:43:37 compute-0 sudo[143928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:37 compute-0 python3.9[143931]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:38 compute-0 sudo[143928]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:38 compute-0 sudo[144084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zukdokxhjejvrfikblhoqpdfyyozrxbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843418.1542616-793-169655975620555/AnsiballZ_systemd.py'
Feb 23 10:43:38 compute-0 sudo[144084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:38 compute-0 python3.9[144087]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:38 compute-0 sudo[144084]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:39 compute-0 sudo[144240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhkqqmnhpamyefmzwxxlxbjfazgxkfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843418.9085581-793-193878835607528/AnsiballZ_systemd.py'
Feb 23 10:43:39 compute-0 sudo[144240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:39 compute-0 python3.9[144243]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:39 compute-0 sudo[144240]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:39 compute-0 sudo[144396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsevtuqfwktyjcfvdenkqfhpimzcmofy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843419.618584-793-213643022224006/AnsiballZ_systemd.py'
Feb 23 10:43:39 compute-0 sudo[144396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:40 compute-0 python3.9[144399]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:40 compute-0 sudo[144396]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:40 compute-0 sudo[144552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pasezhedjzdsdgcscgrsttxctqmvzrpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843420.3210056-793-271053422256535/AnsiballZ_systemd.py'
Feb 23 10:43:40 compute-0 sudo[144552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:40 compute-0 python3.9[144555]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:40 compute-0 sudo[144552]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:41 compute-0 sudo[144708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmwlchmpbhniuiqhqeoisrgzthsiiipc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843421.0040383-793-55172890028298/AnsiballZ_systemd.py'
Feb 23 10:43:41 compute-0 sudo[144708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:41 compute-0 python3.9[144711]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 10:43:41 compute-0 sudo[144708]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:42 compute-0 sudo[144864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ienfuryolwlkmgitetjcmzldaaeujhaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843422.2129035-997-177021455634510/AnsiballZ_file.py'
Feb 23 10:43:42 compute-0 sudo[144864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:42 compute-0 python3.9[144867]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:43:42 compute-0 sudo[144864]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:42 compute-0 sudo[145017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdtoupdwwajuvxiwjsuosrmyfykjjqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843422.7529917-997-244668183632507/AnsiballZ_file.py'
Feb 23 10:43:42 compute-0 sudo[145017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:43 compute-0 python3.9[145020]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:43:43 compute-0 sudo[145017]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:43 compute-0 sudo[145183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtnxqiafgstyrmiuyrmlbvbttpxqmzwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843423.2992473-997-230289212238939/AnsiballZ_file.py'
Feb 23 10:43:43 compute-0 sudo[145183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:43 compute-0 podman[145144]: 2026-02-23 10:43:43.610548067 +0000 UTC m=+0.085432518 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 10:43:43 compute-0 python3.9[145192]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:43:43 compute-0 sudo[145183]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:44 compute-0 sudo[145342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyfrbiogtulmuyffmgsfmcxsjvphmnfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843423.8670692-997-225428680396932/AnsiballZ_file.py'
Feb 23 10:43:44 compute-0 sudo[145342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:44 compute-0 python3.9[145345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:43:44 compute-0 sudo[145342]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:44 compute-0 sudo[145495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nybvsahrfygmtneqtirbbnfqcaidrtak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843424.4134054-997-131327440616457/AnsiballZ_file.py'
Feb 23 10:43:44 compute-0 sudo[145495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:44 compute-0 python3.9[145498]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:43:44 compute-0 sudo[145495]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:45 compute-0 sudo[145648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnslcexsgarhitjrmsizqhgukjbwlyrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843424.9570274-997-222680713952538/AnsiballZ_file.py'
Feb 23 10:43:45 compute-0 sudo[145648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:45 compute-0 python3.9[145651]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:43:45 compute-0 sudo[145648]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:46 compute-0 python3.9[145801]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:43:46 compute-0 podman[145847]: 2026-02-23 10:43:46.956793467 +0000 UTC m=+0.158154485 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 10:43:47 compute-0 sudo[145975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tabdlqfcfyybqcygemifwrpnelpiqawu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843426.788799-1099-58196187241940/AnsiballZ_stat.py'
Feb 23 10:43:47 compute-0 sudo[145975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:47 compute-0 python3.9[145978]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:47 compute-0 sudo[145975]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:47 compute-0 sudo[146101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csnjgsgmyciiukmulzxpefxubgnvwrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843426.788799-1099-58196187241940/AnsiballZ_copy.py'
Feb 23 10:43:47 compute-0 sudo[146101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:48 compute-0 python3.9[146104]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843426.788799-1099-58196187241940/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:48 compute-0 sudo[146101]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:48 compute-0 sudo[146254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaymkwukscuqginwtiornonhzgszkveb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843428.2693346-1099-71515576398878/AnsiballZ_stat.py'
Feb 23 10:43:48 compute-0 sudo[146254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:48 compute-0 python3.9[146257]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:48 compute-0 sudo[146254]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:49 compute-0 sudo[146380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqmqcjxgucmicqjerwhopaftruzdmjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843428.2693346-1099-71515576398878/AnsiballZ_copy.py'
Feb 23 10:43:49 compute-0 sudo[146380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:49 compute-0 python3.9[146383]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843428.2693346-1099-71515576398878/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:49 compute-0 sudo[146380]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:49 compute-0 sudo[146535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fveyhjufhswulnbfvrtwvcnhjfzsgroq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843429.4048455-1099-18151992913288/AnsiballZ_stat.py'
Feb 23 10:43:49 compute-0 sudo[146535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:49 compute-0 sshd-session[146483]: Connection closed by authenticating user root 143.198.30.3 port 40636 [preauth]
Feb 23 10:43:49 compute-0 python3.9[146538]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:49 compute-0 sudo[146535]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:50 compute-0 sudo[146661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sathraygfmobjfaiiadizufvjapynopo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843429.4048455-1099-18151992913288/AnsiballZ_copy.py'
Feb 23 10:43:50 compute-0 sudo[146661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:50 compute-0 python3.9[146664]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843429.4048455-1099-18151992913288/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:50 compute-0 sudo[146661]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:50 compute-0 sudo[146814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phajmipkltdwioxesogcssxenjlmuyla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843430.4924994-1099-104307658288932/AnsiballZ_stat.py'
Feb 23 10:43:50 compute-0 sudo[146814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:50 compute-0 python3.9[146817]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:50 compute-0 sudo[146814]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:51 compute-0 sudo[146940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjypmfgnekxpgcnygiodetngrsiokvgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843430.4924994-1099-104307658288932/AnsiballZ_copy.py'
Feb 23 10:43:51 compute-0 sudo[146940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:51 compute-0 python3.9[146944]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843430.4924994-1099-104307658288932/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:51 compute-0 sudo[146940]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:51 compute-0 sudo[147094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwhqtepszshrrohamrrhylzyzrqeuov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843431.5964148-1099-145615926053000/AnsiballZ_stat.py'
Feb 23 10:43:51 compute-0 sudo[147094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:52 compute-0 python3.9[147097]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:52 compute-0 sudo[147094]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:52 compute-0 sudo[147220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oihpabkxocffpqtpbdxhjxyqozqiqsgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843431.5964148-1099-145615926053000/AnsiballZ_copy.py'
Feb 23 10:43:52 compute-0 sudo[147220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:52 compute-0 python3.9[147223]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843431.5964148-1099-145615926053000/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:52 compute-0 sudo[147220]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:52 compute-0 sudo[147373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmelzyvlymolgwbbsxdcutsuiecnamyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843432.7466176-1099-20141104512835/AnsiballZ_stat.py'
Feb 23 10:43:52 compute-0 sudo[147373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:53 compute-0 python3.9[147376]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:53 compute-0 sudo[147373]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:53 compute-0 sudo[147499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymgebqmvghpamnlsqpvzvjxycqrrtkyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843432.7466176-1099-20141104512835/AnsiballZ_copy.py'
Feb 23 10:43:53 compute-0 sudo[147499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:53 compute-0 python3.9[147502]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843432.7466176-1099-20141104512835/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:53 compute-0 sudo[147499]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:53 compute-0 sudo[147652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvdlufixaqjymhitvxulwlxorlnhzoob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843433.7898679-1099-55309631170794/AnsiballZ_stat.py'
Feb 23 10:43:53 compute-0 sudo[147652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:54 compute-0 python3.9[147655]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:54 compute-0 sudo[147652]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:54 compute-0 sudo[147776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxqlgetdbxlilmyhskandleughxycdrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843433.7898679-1099-55309631170794/AnsiballZ_copy.py'
Feb 23 10:43:54 compute-0 sudo[147776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:54 compute-0 python3.9[147779]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843433.7898679-1099-55309631170794/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:54 compute-0 sudo[147776]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:55 compute-0 sudo[147929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kneegwlfrctppdwkhuomemzfjycbaygi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843434.8360715-1099-228131733301485/AnsiballZ_stat.py'
Feb 23 10:43:55 compute-0 sudo[147929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:55 compute-0 python3.9[147932]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:43:55 compute-0 sudo[147929]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:55 compute-0 sudo[148055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzvjbktbgxvsqugpaghcpjhmbntfutum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843434.8360715-1099-228131733301485/AnsiballZ_copy.py'
Feb 23 10:43:55 compute-0 sudo[148055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:55 compute-0 python3.9[148058]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771843434.8360715-1099-228131733301485/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:55 compute-0 sudo[148055]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:56 compute-0 sudo[148208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvcnouyjtwiicqgdrjfjpujakdibstqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843436.5373843-1325-102399189228815/AnsiballZ_command.py'
Feb 23 10:43:56 compute-0 sudo[148208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:57 compute-0 python3.9[148211]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 23 10:43:57 compute-0 sudo[148208]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:57 compute-0 sudo[148362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmfaqxfbtimlnruactbmvoipslzvaudf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843437.3052588-1343-187886716349697/AnsiballZ_file.py'
Feb 23 10:43:57 compute-0 sudo[148362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:57 compute-0 python3.9[148365]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:57 compute-0 sudo[148362]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:58 compute-0 sudo[148515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xshwuieypdoxodvxpqxntzjlqcrbptqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843437.8905509-1343-114350402260271/AnsiballZ_file.py'
Feb 23 10:43:58 compute-0 sudo[148515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:58 compute-0 python3.9[148518]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:58 compute-0 sudo[148515]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:58 compute-0 sudo[148668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyhviytrrjeujxgsieumpcxqmlpohbmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843438.4444172-1343-193954825570760/AnsiballZ_file.py'
Feb 23 10:43:58 compute-0 sudo[148668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:58 compute-0 python3.9[148671]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:58 compute-0 sudo[148668]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:59 compute-0 sudo[148821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plwlthguiautlxsujfvmdzhdwsmwcvvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843439.032539-1343-59944925655272/AnsiballZ_file.py'
Feb 23 10:43:59 compute-0 sudo[148821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:59 compute-0 python3.9[148824]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:59 compute-0 sudo[148821]: pam_unix(sudo:session): session closed for user root
Feb 23 10:43:59 compute-0 sudo[148974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyjfcsqssjphmleejhbftjqrkoqjxtfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843439.5667691-1343-64491757076015/AnsiballZ_file.py'
Feb 23 10:43:59 compute-0 sudo[148974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:43:59 compute-0 python3.9[148977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:43:59 compute-0 sudo[148974]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:00 compute-0 sudo[149127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjdqvfpoamdsiqdiuzeextyilcimllya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843440.0951328-1343-38897143325332/AnsiballZ_file.py'
Feb 23 10:44:00 compute-0 sudo[149127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:00 compute-0 python3.9[149130]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:00 compute-0 sudo[149127]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:00 compute-0 sudo[149280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idhknjucwqtovwcfkariswmezlnnxoey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843440.6657987-1343-167971197920109/AnsiballZ_file.py'
Feb 23 10:44:00 compute-0 sudo[149280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:01 compute-0 python3.9[149283]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:01 compute-0 sudo[149280]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:01 compute-0 sudo[149433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgdhyenfojwjmxqrwmdynrdkercsedrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843441.242981-1343-204567918388712/AnsiballZ_file.py'
Feb 23 10:44:01 compute-0 sudo[149433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:01 compute-0 python3.9[149436]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:01 compute-0 sudo[149433]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:02 compute-0 sudo[149586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmtycubylcduinjvowqzkktuyxcfvgat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843441.8204691-1343-165815162531713/AnsiballZ_file.py'
Feb 23 10:44:02 compute-0 sudo[149586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:02 compute-0 python3.9[149589]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:02 compute-0 sudo[149586]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:02 compute-0 sudo[149739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxdwwichivewpgyhilglijuvlebezlip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843442.3851855-1343-166488301925755/AnsiballZ_file.py'
Feb 23 10:44:02 compute-0 sudo[149739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:02 compute-0 python3.9[149742]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:02 compute-0 sudo[149739]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:03 compute-0 sudo[149892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olfwniffwzzrmcnhskedjcsnrfgleigb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843442.9134607-1343-74205622249233/AnsiballZ_file.py'
Feb 23 10:44:03 compute-0 sudo[149892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:03 compute-0 python3.9[149895]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:03 compute-0 sudo[149892]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:03 compute-0 sudo[150045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiiihlqfvhbaujndobpleyyvdyflqweb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843443.5268643-1343-201910299967201/AnsiballZ_file.py'
Feb 23 10:44:03 compute-0 sudo[150045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:03 compute-0 python3.9[150048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:04 compute-0 sudo[150045]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:04 compute-0 sudo[150198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gueumosteuaymjsjcbmninuzylyadygj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843444.1337988-1343-47588276283946/AnsiballZ_file.py'
Feb 23 10:44:04 compute-0 sudo[150198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:04 compute-0 python3.9[150201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:04 compute-0 sudo[150198]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:04 compute-0 sudo[150351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvpdmloryvbrqxgiyykabmhdzestldnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843444.6771107-1343-141626556146312/AnsiballZ_file.py'
Feb 23 10:44:04 compute-0 sudo[150351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:05 compute-0 python3.9[150354]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:05 compute-0 sudo[150351]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:06 compute-0 sudo[150504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edbhgcgilnvyrtaiwduhfkhrhdgratdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843446.4906185-1541-194219879027497/AnsiballZ_stat.py'
Feb 23 10:44:06 compute-0 sudo[150504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:06 compute-0 python3.9[150507]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:06 compute-0 sudo[150504]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:07 compute-0 sudo[150628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onjrbuzghvjjmixhdwjgjobfngolkpsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843446.4906185-1541-194219879027497/AnsiballZ_copy.py'
Feb 23 10:44:07 compute-0 sudo[150628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:07 compute-0 python3.9[150631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843446.4906185-1541-194219879027497/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:07 compute-0 sudo[150628]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:07 compute-0 sudo[150781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwstvfunmgspehugeokzzbhhruacbgll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843447.5691597-1541-34700735529457/AnsiballZ_stat.py'
Feb 23 10:44:07 compute-0 sudo[150781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:08 compute-0 python3.9[150784]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:08 compute-0 sudo[150781]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:08 compute-0 sudo[150905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgosyrhoybzjahwwqpeimbmlmybnkonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843447.5691597-1541-34700735529457/AnsiballZ_copy.py'
Feb 23 10:44:08 compute-0 sudo[150905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:08 compute-0 python3.9[150908]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843447.5691597-1541-34700735529457/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:08 compute-0 sudo[150905]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:08 compute-0 sudo[151058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukqholkesfyzzqkqvghmnmgjtgdiksks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843448.6748605-1541-185357348283611/AnsiballZ_stat.py'
Feb 23 10:44:08 compute-0 sudo[151058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:09 compute-0 python3.9[151061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:09 compute-0 sudo[151058]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:09 compute-0 sshd-session[151132]: Connection closed by authenticating user root 165.227.79.48 port 51526 [preauth]
Feb 23 10:44:09 compute-0 sudo[151184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmbulbexgbsfxwdxxrekxrfxqxhxiydb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843448.6748605-1541-185357348283611/AnsiballZ_copy.py'
Feb 23 10:44:09 compute-0 sudo[151184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:09 compute-0 python3.9[151187]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843448.6748605-1541-185357348283611/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:09 compute-0 sudo[151184]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:09 compute-0 sudo[151337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhyejvlsaxlbqvozvmsvcgzqzxxgxhxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843449.7140462-1541-201562121283474/AnsiballZ_stat.py'
Feb 23 10:44:09 compute-0 sudo[151337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:10 compute-0 python3.9[151340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:10 compute-0 sudo[151337]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:10 compute-0 sudo[151461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-timvowipgvwgrfxfgcipozpycvnmnmnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843449.7140462-1541-201562121283474/AnsiballZ_copy.py'
Feb 23 10:44:10 compute-0 sudo[151461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:10 compute-0 python3.9[151464]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843449.7140462-1541-201562121283474/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:10 compute-0 sudo[151461]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:11 compute-0 sudo[151614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihowjcbjzjdguehewclhlqxxyffxfuas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843450.8110852-1541-250607321656338/AnsiballZ_stat.py'
Feb 23 10:44:11 compute-0 sudo[151614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:11 compute-0 python3.9[151617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:11 compute-0 sudo[151614]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:11 compute-0 sudo[151738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvppyfybaqyqeanzgjoajtxyghczdwwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843450.8110852-1541-250607321656338/AnsiballZ_copy.py'
Feb 23 10:44:11 compute-0 sudo[151738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:11 compute-0 python3.9[151741]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843450.8110852-1541-250607321656338/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:11 compute-0 sudo[151738]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:12 compute-0 sudo[151891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdkymlwoaijcjitjkptjnbhecfzgjudr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843451.7990792-1541-48339217813438/AnsiballZ_stat.py'
Feb 23 10:44:12 compute-0 sudo[151891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:12 compute-0 python3.9[151894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:12 compute-0 sudo[151891]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:12 compute-0 sudo[152015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvwenvuwdljfvwfikyjtxgriipdfauji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843451.7990792-1541-48339217813438/AnsiballZ_copy.py'
Feb 23 10:44:12 compute-0 sudo[152015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:44:12.621 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:44:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:44:12.621 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:44:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:44:12.621 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:44:12 compute-0 python3.9[152018]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843451.7990792-1541-48339217813438/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:12 compute-0 sudo[152015]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:13 compute-0 sudo[152168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iczewgcakucljlcxywmfjqnrasgnsjfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843452.9029777-1541-80255171749559/AnsiballZ_stat.py'
Feb 23 10:44:13 compute-0 sudo[152168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:13 compute-0 python3.9[152171]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:13 compute-0 sudo[152168]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:13 compute-0 sudo[152305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkvjosoxlphlriucebgcynemglxpucfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843452.9029777-1541-80255171749559/AnsiballZ_copy.py'
Feb 23 10:44:13 compute-0 sudo[152305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:13 compute-0 podman[152266]: 2026-02-23 10:44:13.776423384 +0000 UTC m=+0.062062502 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:44:13 compute-0 python3.9[152310]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843452.9029777-1541-80255171749559/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:13 compute-0 sudo[152305]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:14 compute-0 sudo[152464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnyemtaxhbmlnfgtxbxidzsdkoubjyfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843454.0848975-1541-85616931843305/AnsiballZ_stat.py'
Feb 23 10:44:14 compute-0 sudo[152464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:14 compute-0 python3.9[152467]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:14 compute-0 sudo[152464]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:14 compute-0 sudo[152588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rojmgrbbqatmgrlstekmyulkocrghqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843454.0848975-1541-85616931843305/AnsiballZ_copy.py'
Feb 23 10:44:14 compute-0 sudo[152588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:14 compute-0 python3.9[152591]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843454.0848975-1541-85616931843305/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:14 compute-0 sudo[152588]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:15 compute-0 sudo[152741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-useincthvqlrxsonmrfipgzoictmcwhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843455.129397-1541-5810879141129/AnsiballZ_stat.py'
Feb 23 10:44:15 compute-0 sudo[152741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:15 compute-0 python3.9[152744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:15 compute-0 sudo[152741]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:15 compute-0 sudo[152865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riydxhzynpulzfzlvrhpqzlfdumiirev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843455.129397-1541-5810879141129/AnsiballZ_copy.py'
Feb 23 10:44:15 compute-0 sudo[152865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:16 compute-0 python3.9[152868]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843455.129397-1541-5810879141129/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:16 compute-0 sudo[152865]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:16 compute-0 sudo[153018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yflfrjddqicnlqwgtvgrmfrckaqvksqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843456.2230384-1541-183097050293022/AnsiballZ_stat.py'
Feb 23 10:44:16 compute-0 sudo[153018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:16 compute-0 python3.9[153021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:16 compute-0 sudo[153018]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:17 compute-0 sudo[153152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvaftzjuhefqerrdorahrnvlineqvumf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843456.2230384-1541-183097050293022/AnsiballZ_copy.py'
Feb 23 10:44:17 compute-0 sudo[153152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:17 compute-0 podman[153116]: 2026-02-23 10:44:17.19227155 +0000 UTC m=+0.128124456 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 10:44:17 compute-0 python3.9[153160]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843456.2230384-1541-183097050293022/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:17 compute-0 sudo[153152]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:17 compute-0 sudo[153319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfnbrloeietizvswwkxbpmiegcdgfijp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843457.4008272-1541-119584527514623/AnsiballZ_stat.py'
Feb 23 10:44:17 compute-0 sudo[153319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:17 compute-0 python3.9[153322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:17 compute-0 sudo[153319]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:18 compute-0 sudo[153443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwpjahnozldaomrjumqrxeotgekboscp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843457.4008272-1541-119584527514623/AnsiballZ_copy.py'
Feb 23 10:44:18 compute-0 sudo[153443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:18 compute-0 python3.9[153446]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843457.4008272-1541-119584527514623/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:18 compute-0 sudo[153443]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:18 compute-0 sudo[153596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbggcxmaxvgqlrdjawaiumrlvrjrjklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843458.4227023-1541-155573505865276/AnsiballZ_stat.py'
Feb 23 10:44:18 compute-0 sudo[153596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:18 compute-0 python3.9[153599]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:18 compute-0 sudo[153596]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:19 compute-0 sudo[153720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifqpzpijbcbawdeutxiyfkzwyfcakzcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843458.4227023-1541-155573505865276/AnsiballZ_copy.py'
Feb 23 10:44:19 compute-0 sudo[153720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:19 compute-0 python3.9[153723]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843458.4227023-1541-155573505865276/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:19 compute-0 sudo[153720]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:19 compute-0 sudo[153873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcgehwdnilqrvpvlythrvjcjqjqlvikc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843459.4622316-1541-212533108422891/AnsiballZ_stat.py'
Feb 23 10:44:19 compute-0 sudo[153873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:19 compute-0 python3.9[153876]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:19 compute-0 sudo[153873]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:20 compute-0 sudo[153997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chjsdwcyiogxnpredunahtvgcwjcuecb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843459.4622316-1541-212533108422891/AnsiballZ_copy.py'
Feb 23 10:44:20 compute-0 sudo[153997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:20 compute-0 python3.9[154000]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843459.4622316-1541-212533108422891/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:20 compute-0 sudo[153997]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:20 compute-0 sudo[154150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzuxktrkmxhbadkfepdbdhjqsylmjnis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843460.5333436-1541-75687160316604/AnsiballZ_stat.py'
Feb 23 10:44:20 compute-0 sudo[154150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:20 compute-0 python3.9[154153]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:21 compute-0 sudo[154150]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:21 compute-0 sudo[154274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blmbpyddqhyhmemqexepmtdlnlstebqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843460.5333436-1541-75687160316604/AnsiballZ_copy.py'
Feb 23 10:44:21 compute-0 sudo[154274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:21 compute-0 python3.9[154277]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843460.5333436-1541-75687160316604/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:21 compute-0 sudo[154274]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:22 compute-0 sshd-session[154302]: Connection closed by authenticating user root 143.198.30.3 port 49578 [preauth]
Feb 23 10:44:22 compute-0 python3.9[154429]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:44:23 compute-0 sudo[154582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzsuyblfllblmqnpbovnsaxkprpertsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843463.3679783-1953-186300427173585/AnsiballZ_seboolean.py'
Feb 23 10:44:23 compute-0 sudo[154582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:23 compute-0 python3.9[154585]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 23 10:44:24 compute-0 sudo[154582]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:25 compute-0 sudo[154739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdpaujhzwhsthsdlgaesnnrtgheolrqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843465.0698292-1969-186146358183736/AnsiballZ_copy.py'
Feb 23 10:44:25 compute-0 dbus-broker-launch[791]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 23 10:44:25 compute-0 sudo[154739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:25 compute-0 python3.9[154742]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:25 compute-0 sudo[154739]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:25 compute-0 sudo[154892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrmwmszwsatyuqxsxtizvaqyhipatmlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843465.5782623-1969-261384939719506/AnsiballZ_copy.py'
Feb 23 10:44:25 compute-0 sudo[154892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:25 compute-0 python3.9[154895]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:25 compute-0 sudo[154892]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:26 compute-0 sudo[155045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjtsmctaeulvcufebskxzllubkexrtym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843466.0708535-1969-274967483681734/AnsiballZ_copy.py'
Feb 23 10:44:26 compute-0 sudo[155045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:26 compute-0 python3.9[155048]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:26 compute-0 sudo[155045]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:27 compute-0 sudo[155198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwnituvmkrduuveqbsrxyzgpepdzdrkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843466.6321256-1969-41105890446051/AnsiballZ_copy.py'
Feb 23 10:44:27 compute-0 sudo[155198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:27 compute-0 python3.9[155201]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:27 compute-0 sudo[155198]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:27 compute-0 sudo[155351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyuuldvdetinqklbxamuazlnxeoapanu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843467.3992822-1969-209789824178980/AnsiballZ_copy.py'
Feb 23 10:44:27 compute-0 sudo[155351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:27 compute-0 python3.9[155354]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:27 compute-0 sudo[155351]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:28 compute-0 sudo[155504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbwaufbxuyloryvfbtozqelvcgpxeqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843468.3239286-2041-195653592338560/AnsiballZ_copy.py'
Feb 23 10:44:28 compute-0 sudo[155504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:28 compute-0 python3.9[155507]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:28 compute-0 sudo[155504]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:29 compute-0 sudo[155657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kweawlmhhvvgatxtxnkmzytjyogxrnrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843468.8988914-2041-1818738845417/AnsiballZ_copy.py'
Feb 23 10:44:29 compute-0 sudo[155657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:29 compute-0 python3.9[155660]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:29 compute-0 sudo[155657]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:29 compute-0 sudo[155810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqycultmzbsmhihcpynydtqrslohckrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843469.3871696-2041-204139104242136/AnsiballZ_copy.py'
Feb 23 10:44:29 compute-0 sudo[155810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:29 compute-0 python3.9[155813]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:29 compute-0 sudo[155810]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:30 compute-0 sudo[155963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtaqnpfohbblgtrpbewiegmyrymlyyxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843469.976366-2041-9701717870895/AnsiballZ_copy.py'
Feb 23 10:44:30 compute-0 sudo[155963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:30 compute-0 python3.9[155966]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:30 compute-0 sudo[155963]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:30 compute-0 sudo[156116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vszxiewwncyongiszwzczazqflaguewr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843470.4859996-2041-78498485281810/AnsiballZ_copy.py'
Feb 23 10:44:30 compute-0 sudo[156116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:30 compute-0 python3.9[156119]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:30 compute-0 sudo[156116]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:31 compute-0 sudo[156269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlejxhjkjidrmepohvucjyhmjcgmavgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843471.6611605-2113-252398893518747/AnsiballZ_systemd.py'
Feb 23 10:44:31 compute-0 sudo[156269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:32 compute-0 python3.9[156272]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:44:32 compute-0 systemd[1]: Reloading.
Feb 23 10:44:32 compute-0 systemd-rc-local-generator[156290]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:44:32 compute-0 systemd-sysv-generator[156297]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:44:32 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 23 10:44:32 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 23 10:44:32 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 23 10:44:32 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 23 10:44:32 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 23 10:44:32 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 23 10:44:32 compute-0 sudo[156269]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:32 compute-0 sudo[156469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iejyzvvtrytmplajgdylsjazjxsvfqrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843472.618069-2113-196646302399670/AnsiballZ_systemd.py'
Feb 23 10:44:32 compute-0 sudo[156469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:33 compute-0 python3.9[156472]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:44:33 compute-0 systemd[1]: Reloading.
Feb 23 10:44:33 compute-0 systemd-sysv-generator[156501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:44:33 compute-0 systemd-rc-local-generator[156496]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:44:33 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 23 10:44:33 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 23 10:44:33 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 23 10:44:33 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 23 10:44:33 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 23 10:44:33 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 23 10:44:33 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 23 10:44:33 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 23 10:44:33 compute-0 sudo[156469]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:33 compute-0 sudo[156692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojidwczpxryekqkstugnhhmrljjxfwyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843473.5438387-2113-148408384868919/AnsiballZ_systemd.py'
Feb 23 10:44:33 compute-0 sudo[156692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:34 compute-0 python3.9[156695]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:44:34 compute-0 systemd[1]: Reloading.
Feb 23 10:44:34 compute-0 systemd-rc-local-generator[156720]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:44:34 compute-0 systemd-sysv-generator[156723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:44:34 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 23 10:44:34 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 23 10:44:34 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 23 10:44:34 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 23 10:44:34 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 23 10:44:34 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 23 10:44:34 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 23 10:44:34 compute-0 sudo[156692]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:34 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 23 10:44:34 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 23 10:44:34 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 23 10:44:34 compute-0 sudo[156918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iawnlmhjokuoflztllfhcekrgfagvunm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843474.5889528-2113-4729159170165/AnsiballZ_systemd.py'
Feb 23 10:44:34 compute-0 sudo[156918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:35 compute-0 python3.9[156922]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:44:35 compute-0 systemd[1]: Reloading.
Feb 23 10:44:35 compute-0 systemd-rc-local-generator[156947]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:44:35 compute-0 systemd-sysv-generator[156950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:44:35 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 23 10:44:35 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 23 10:44:35 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 23 10:44:35 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 23 10:44:35 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 23 10:44:35 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 23 10:44:35 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 23 10:44:35 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 23 10:44:35 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 23 10:44:35 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 23 10:44:35 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 23 10:44:35 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 23 10:44:35 compute-0 sudo[156918]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:35 compute-0 setroubleshoot[156739]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 06e98477-82fb-4f4f-8e16-d48704d6c3aa
Feb 23 10:44:35 compute-0 setroubleshoot[156739]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 23 10:44:35 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
                                                  
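The two sealert plugin suggestions above can be collected into one ordered triage plan. A minimal sketch follows; the commands are printed rather than executed, since every step needs root plus a running auditd, and the policy-module step should only be taken if the AVC inspection shows no file with bad ownership or permissions. The module name `my-virtlogd` is taken from the message itself.

```shell
#!/bin/sh
# Ordered plan condensed from the dac_override and catchall plugin
# suggestions in the sealert message above. Printed, not executed.
selinux_triage_plan() {
  cat <<'EOF'
auditctl -w /etc/shadow -p w                                # 1. full auditing: future AVCs carry PATH records
ausearch -m avc -ts recent                                  # 2. after reproducing the denial, inspect the AVCs
ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd   # 3. no bad PATH record? build a local policy module
semodule -X 300 -i my-virtlogd.pp                           # 4. install the module at priority 300
EOF
}

selinux_triage_plan
```

If step 2 does surface a PATH record pointing at a mis-owned or mis-permissioned file, fixing that file is the right resolution; the local policy module is the fallback, per the catchall plugin's lower (9.59) confidence.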
Feb 23 10:44:36 compute-0 sudo[157145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apxmhltobfbzuenywfpkawottaabqqvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843475.6521704-2113-49409455778271/AnsiballZ_systemd.py'
Feb 23 10:44:36 compute-0 sudo[157145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:36 compute-0 python3.9[157148]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:44:36 compute-0 systemd[1]: Reloading.
Feb 23 10:44:36 compute-0 systemd-rc-local-generator[157176]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:44:36 compute-0 systemd-sysv-generator[157179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:44:36 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 23 10:44:36 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 23 10:44:36 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 23 10:44:36 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 23 10:44:36 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 23 10:44:36 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 23 10:44:36 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 23 10:44:36 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 23 10:44:36 compute-0 sudo[157145]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:37 compute-0 sudo[157364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nimjajwibgjjvdmynhsbqyeypokkadkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843477.1095288-2187-60093848891727/AnsiballZ_file.py'
Feb 23 10:44:37 compute-0 sudo[157364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:37 compute-0 python3.9[157367]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:37 compute-0 sudo[157364]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:38 compute-0 sudo[157517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eepkfxxyvomskchgfefoieqzcdgclajh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843477.8167546-2203-64099820422704/AnsiballZ_find.py'
Feb 23 10:44:38 compute-0 sudo[157517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:38 compute-0 python3.9[157520]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 10:44:38 compute-0 sudo[157517]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:38 compute-0 sudo[157670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oygqysgjtljotfugdpfpbflcqgchykra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843478.7402792-2231-81143080384188/AnsiballZ_stat.py'
Feb 23 10:44:38 compute-0 sudo[157670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:39 compute-0 python3.9[157673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:39 compute-0 sudo[157670]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:39 compute-0 sudo[157794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkiijiecsysgnatjqwbjvppouklyswzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843478.7402792-2231-81143080384188/AnsiballZ_copy.py'
Feb 23 10:44:39 compute-0 sudo[157794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:39 compute-0 python3.9[157797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843478.7402792-2231-81143080384188/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:39 compute-0 sudo[157794]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:40 compute-0 sudo[157947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxcamvlqgjhwbkiswathukrtufdnxglk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843480.1812794-2263-34998406973271/AnsiballZ_file.py'
Feb 23 10:44:40 compute-0 sudo[157947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:40 compute-0 python3.9[157950]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:40 compute-0 sudo[157947]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:41 compute-0 sudo[158100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onyhhytxmkirnuecqtsavaayuhkhtsgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843480.8012538-2279-35254253773289/AnsiballZ_stat.py'
Feb 23 10:44:41 compute-0 sudo[158100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:41 compute-0 python3.9[158103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:41 compute-0 sudo[158100]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:41 compute-0 sudo[158179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyccaflvnigpwwzyfyxmzanmmyieqycy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843480.8012538-2279-35254253773289/AnsiballZ_file.py'
Feb 23 10:44:41 compute-0 sudo[158179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:41 compute-0 python3.9[158182]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:41 compute-0 sudo[158179]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:42 compute-0 sudo[158332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssuqwvmhddtemodfrkuhpfpvthouras ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843482.0824015-2303-195930739636588/AnsiballZ_stat.py'
Feb 23 10:44:42 compute-0 sudo[158332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:42 compute-0 python3.9[158335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:42 compute-0 sudo[158332]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:42 compute-0 sudo[158411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itxsprrrgchrmmiipentzihxxzunenrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843482.0824015-2303-195930739636588/AnsiballZ_file.py'
Feb 23 10:44:42 compute-0 sudo[158411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:42 compute-0 python3.9[158414]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.p56gq4xv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:43 compute-0 sudo[158411]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:43 compute-0 sudo[158564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zehdnhyhecrfafkibmsrmuajddavscdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843483.3362136-2327-186557481593089/AnsiballZ_stat.py'
Feb 23 10:44:43 compute-0 sudo[158564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:43 compute-0 python3.9[158567]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:43 compute-0 sudo[158564]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:43 compute-0 podman[158570]: 2026-02-23 10:44:43.924359706 +0000 UTC m=+0.060990308 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 10:44:44 compute-0 sudo[158660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qubioedzvnagafviqilewobispbpupmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843483.3362136-2327-186557481593089/AnsiballZ_file.py'
Feb 23 10:44:44 compute-0 sudo[158660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:44 compute-0 python3.9[158663]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:44 compute-0 sudo[158660]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:44 compute-0 sudo[158813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbjenyfdqncqhxlvzpgmrryttavjrcdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843484.5168762-2353-242124234183553/AnsiballZ_command.py'
Feb 23 10:44:44 compute-0 sudo[158813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:45 compute-0 python3.9[158816]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:44:45 compute-0 sudo[158813]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:45 compute-0 sudo[158967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-winjurjxaykpgyqdgrikgsnvvwiaqkay ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843485.347377-2369-255807965915210/AnsiballZ_edpm_nftables_from_files.py'
Feb 23 10:44:45 compute-0 sudo[158967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:45 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 23 10:44:45 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 23 10:44:45 compute-0 python3[158970]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 10:44:45 compute-0 sudo[158967]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:46 compute-0 sudo[159120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjckmkiqcwwfildtlsjirqsmpkaenbru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843486.128025-2385-220987389786028/AnsiballZ_stat.py'
Feb 23 10:44:46 compute-0 sudo[159120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:46 compute-0 python3.9[159123]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:46 compute-0 sudo[159120]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:46 compute-0 sudo[159199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grtkoggbnqsukzwdgaiankgtwdgvyajw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843486.128025-2385-220987389786028/AnsiballZ_file.py'
Feb 23 10:44:46 compute-0 sudo[159199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:47 compute-0 python3.9[159202]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:47 compute-0 sudo[159199]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:47 compute-0 sudo[159367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmxhrrxwpcvvndyfqvuvzcjmiqkoolhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843487.2081907-2409-132884854639694/AnsiballZ_stat.py'
Feb 23 10:44:47 compute-0 sudo[159367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:47 compute-0 podman[159326]: 2026-02-23 10:44:47.538617993 +0000 UTC m=+0.103214027 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 10:44:47 compute-0 python3.9[159378]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:47 compute-0 sudo[159367]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:47 compute-0 sudo[159504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lueadjvkwhjnpfrqlncfromrzrikodlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843487.2081907-2409-132884854639694/AnsiballZ_copy.py'
Feb 23 10:44:47 compute-0 sudo[159504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:48 compute-0 python3.9[159507]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843487.2081907-2409-132884854639694/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:48 compute-0 sudo[159504]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:48 compute-0 sudo[159657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcugihzppcgxzaxpvdhnulwjxjioalum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843488.469871-2439-115909351054294/AnsiballZ_stat.py'
Feb 23 10:44:48 compute-0 sudo[159657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:48 compute-0 python3.9[159660]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:48 compute-0 sudo[159657]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:49 compute-0 sudo[159736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtncxybzzyvkdifihruklyzcrsudvuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843488.469871-2439-115909351054294/AnsiballZ_file.py'
Feb 23 10:44:49 compute-0 sudo[159736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:49 compute-0 python3.9[159739]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:49 compute-0 sudo[159736]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:49 compute-0 sudo[159889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzgbtvlqalkxxmuxcviblwfrwpffombc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843489.5927615-2463-157882888828559/AnsiballZ_stat.py'
Feb 23 10:44:49 compute-0 sudo[159889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:50 compute-0 python3.9[159892]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:50 compute-0 sudo[159889]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:50 compute-0 sudo[159968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkcvfovaswtzkovuultcosltoybdihl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843489.5927615-2463-157882888828559/AnsiballZ_file.py'
Feb 23 10:44:50 compute-0 sudo[159968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:50 compute-0 python3.9[159971]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:50 compute-0 sudo[159968]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:51 compute-0 sudo[160121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iubhupdmhcztegdkvfvdqeypwsucfxrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843490.793838-2487-131542850848957/AnsiballZ_stat.py'
Feb 23 10:44:51 compute-0 sudo[160121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:52 compute-0 python3.9[160124]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:52 compute-0 sudo[160121]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:52 compute-0 sudo[160247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xesedepywqijdlepsaqrmmfldlfaypii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843490.793838-2487-131542850848957/AnsiballZ_copy.py'
Feb 23 10:44:52 compute-0 sudo[160247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:52 compute-0 python3.9[160250]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843490.793838-2487-131542850848957/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:52 compute-0 sudo[160247]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:53 compute-0 sudo[160400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baqwuckiontpmxuncwyqxodjkwzktdee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843492.8491907-2517-61157926363025/AnsiballZ_file.py'
Feb 23 10:44:53 compute-0 sudo[160400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:53 compute-0 python3.9[160403]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:53 compute-0 sudo[160400]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:53 compute-0 sudo[160553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqwuevjqijhkpdzesqbcpnkevkyjuybj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843493.5399122-2533-52610496497643/AnsiballZ_command.py'
Feb 23 10:44:53 compute-0 sudo[160553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:53 compute-0 python3.9[160556]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:44:54 compute-0 sudo[160553]: pam_unix(sudo:session): session closed for user root
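The validation task above concatenates the edpm nftables fragments in dependency order (chains first, then flushes, rules, jump updates, and jumps last) and syntax-checks the combined ruleset with `nft -c -f -` before anything is loaded. A small sketch of that composition follows; the file names match the log, and the command is printed rather than run, since executing it requires the fragment files and the nftables userspace tools.

```shell
#!/bin/sh
# Compose the check command used by the Ansible task above:
# fragments must be concatenated in this order so that chain
# definitions exist before the rules and jumps that reference them.
EDPM_ORDER="edpm-chains.nft edpm-flushes.nft edpm-rules.nft edpm-update-jumps.nft edpm-jumps.nft"

edpm_check_cmd() {
  files=""
  for f in $EDPM_ORDER; do
    files="$files /etc/nftables/$f"
  done
  # nft -c parses and validates the ruleset without applying it.
  printf 'cat%s | nft -c -f -\n' "$files"
}

edpm_check_cmd
```

Only `edpm-chains.nft` is applied immediately afterwards (`nft -f /etc/nftables/edpm-chains.nft` at 10:44:55); the remaining fragments are picked up via the includes written into `/etc/sysconfig/nftables.conf`.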
Feb 23 10:44:54 compute-0 sudo[160709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plvymzswgsqbwppwkncoqwuehjmpeqpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843494.2674854-2549-270085681388194/AnsiballZ_blockinfile.py'
Feb 23 10:44:54 compute-0 sudo[160709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:54 compute-0 python3.9[160712]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:54 compute-0 sudo[160709]: pam_unix(sudo:session): session closed for user root
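Given the blockinfile parameters logged above (block of four includes, `marker=# {mark} ANSIBLE MANAGED BLOCK`, `marker_begin=BEGIN`, `marker_end=END`), the managed block the task maintains in `/etc/sysconfig/nftables.conf` looks like the following; the surrounding file contents are untouched, and the whole file is validated with `nft -c -f %s` before the write is committed.

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```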
Feb 23 10:44:55 compute-0 sshd-session[160737]: Connection closed by authenticating user root 143.198.30.3 port 49718 [preauth]
Feb 23 10:44:55 compute-0 sudo[160864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwpnmnnfxydvsgciojdaoqoxdzfhmqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843495.2581093-2567-61489094568089/AnsiballZ_command.py'
Feb 23 10:44:55 compute-0 sudo[160864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:55 compute-0 python3.9[160867]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:44:55 compute-0 sudo[160864]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:56 compute-0 sudo[161018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqndlnwuyzpikfcxzvmzlsujfqkmpvzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843495.8976824-2583-6250410858929/AnsiballZ_stat.py'
Feb 23 10:44:56 compute-0 sudo[161018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:56 compute-0 python3.9[161021]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:44:56 compute-0 sudo[161018]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:56 compute-0 sshd-session[161123]: Connection closed by authenticating user root 165.227.79.48 port 47020 [preauth]
Feb 23 10:44:56 compute-0 sudo[161175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxttiaurqltejoikpdymgapmbsldezjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843496.513356-2599-97000237856655/AnsiballZ_command.py'
Feb 23 10:44:56 compute-0 sudo[161175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:56 compute-0 python3.9[161178]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:44:57 compute-0 sudo[161175]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:57 compute-0 sudo[161331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcocpjslmzdzsndzyadxvsgvgsoskgrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843497.2712507-2615-113424080213393/AnsiballZ_file.py'
Feb 23 10:44:57 compute-0 sudo[161331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:57 compute-0 python3.9[161334]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:57 compute-0 sudo[161331]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:58 compute-0 sudo[161484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmnndfnrkzvxzoijokjmydkdgjitkwzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843498.1235147-2631-264227207918385/AnsiballZ_stat.py'
Feb 23 10:44:58 compute-0 sudo[161484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:58 compute-0 python3.9[161487]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:58 compute-0 sudo[161484]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:58 compute-0 sudo[161608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sauyagtbfuhtmpxsmvymgyiewuxnlixk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843498.1235147-2631-264227207918385/AnsiballZ_copy.py'
Feb 23 10:44:58 compute-0 sudo[161608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:59 compute-0 python3.9[161611]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843498.1235147-2631-264227207918385/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:44:59 compute-0 sudo[161608]: pam_unix(sudo:session): session closed for user root
Feb 23 10:44:59 compute-0 sudo[161761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdkoyvimjwlmungynrxzjjcjjcnegato ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843499.3884249-2661-234782458075370/AnsiballZ_stat.py'
Feb 23 10:44:59 compute-0 sudo[161761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:44:59 compute-0 python3.9[161764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:44:59 compute-0 sudo[161761]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:00 compute-0 sudo[161885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofsqyuaknzyndmlokkfcuskggrycitja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843499.3884249-2661-234782458075370/AnsiballZ_copy.py'
Feb 23 10:45:00 compute-0 sudo[161885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:00 compute-0 python3.9[161888]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843499.3884249-2661-234782458075370/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:00 compute-0 sudo[161885]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:00 compute-0 sudo[162038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcavyyewinebvdyrqphamyferjthjqjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843500.6262765-2691-53934718494213/AnsiballZ_stat.py'
Feb 23 10:45:00 compute-0 sudo[162038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:01 compute-0 python3.9[162041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:45:01 compute-0 sudo[162038]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:01 compute-0 sudo[162162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqgvswuqjqjwixuzdipbdfhghpfflrwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843500.6262765-2691-53934718494213/AnsiballZ_copy.py'
Feb 23 10:45:01 compute-0 sudo[162162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:01 compute-0 python3.9[162165]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843500.6262765-2691-53934718494213/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:01 compute-0 sudo[162162]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:03 compute-0 sudo[162315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhskrxoispetfgttkiztwaxocuvmjrzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843502.7636693-2721-21558296700359/AnsiballZ_systemd.py'
Feb 23 10:45:03 compute-0 sudo[162315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:03 compute-0 python3.9[162318]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:45:03 compute-0 systemd[1]: Reloading.
Feb 23 10:45:03 compute-0 systemd-sysv-generator[162345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:45:03 compute-0 systemd-rc-local-generator[162342]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:45:03 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 23 10:45:03 compute-0 sudo[162315]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:04 compute-0 sudo[162514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfcbfkyxzabldieizencfbhdalmevwcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843503.8306658-2737-11452178407804/AnsiballZ_systemd.py'
Feb 23 10:45:04 compute-0 sudo[162514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:04 compute-0 python3.9[162517]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 10:45:04 compute-0 systemd[1]: Reloading.
Feb 23 10:45:04 compute-0 systemd-rc-local-generator[162544]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:45:04 compute-0 systemd-sysv-generator[162549]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:45:04 compute-0 systemd[1]: Reloading.
Feb 23 10:45:04 compute-0 systemd-rc-local-generator[162586]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:45:04 compute-0 systemd-sysv-generator[162589]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:45:04 compute-0 sudo[162514]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:05 compute-0 sshd-session[107552]: Connection closed by 192.168.122.30 port 50462
Feb 23 10:45:05 compute-0 sshd-session[107549]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:45:05 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 23 10:45:05 compute-0 systemd[1]: session-23.scope: Consumed 2min 40.397s CPU time.
Feb 23 10:45:05 compute-0 systemd-logind[808]: Session 23 logged out. Waiting for processes to exit.
Feb 23 10:45:05 compute-0 systemd-logind[808]: Removed session 23.
Feb 23 10:45:11 compute-0 sshd-session[162628]: Accepted publickey for zuul from 192.168.122.30 port 33018 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:45:11 compute-0 systemd-logind[808]: New session 24 of user zuul.
Feb 23 10:45:11 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 23 10:45:11 compute-0 sshd-session[162628]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:45:12 compute-0 python3.9[162781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:45:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:45:12.621 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:45:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:45:12.623 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:45:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:45:12.623 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:45:13 compute-0 python3.9[162935]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:45:13 compute-0 network[162952]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:45:13 compute-0 network[162953]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:45:13 compute-0 network[162954]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:45:14 compute-0 podman[162965]: 2026-02-23 10:45:14.045910406 +0000 UTC m=+0.070243565 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 10:45:17 compute-0 sudo[163243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtvamqqgqypuuwcwwsoweyyuzoabthit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843517.119773-69-189225867100791/AnsiballZ_setup.py'
Feb 23 10:45:17 compute-0 sudo[163243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:17 compute-0 python3.9[163246]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 10:45:17 compute-0 podman[163255]: 2026-02-23 10:45:17.90186784 +0000 UTC m=+0.094002741 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:45:17 compute-0 sudo[163243]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:18 compute-0 sudo[163355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbudzrirbzmmgilvfyytxfntsvurvauj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843517.119773-69-189225867100791/AnsiballZ_dnf.py'
Feb 23 10:45:18 compute-0 sudo[163355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:18 compute-0 python3.9[163358]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:45:22 compute-0 sshd-session[163360]: Invalid user admin from 185.156.73.233 port 51150
Feb 23 10:45:22 compute-0 sshd-session[163360]: Connection closed by invalid user admin 185.156.73.233 port 51150 [preauth]
Feb 23 10:45:23 compute-0 sudo[163355]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:24 compute-0 sudo[163511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avllurreoktyhgxqtaicberojaalqoyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843523.9447885-93-268289622870852/AnsiballZ_stat.py'
Feb 23 10:45:24 compute-0 sudo[163511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:24 compute-0 python3.9[163514]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:45:24 compute-0 sudo[163511]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:25 compute-0 sudo[163664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jffogcwbroewegbykqedwwvtavbkpuom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843524.9302034-113-3580408350195/AnsiballZ_command.py'
Feb 23 10:45:25 compute-0 sudo[163664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:25 compute-0 python3.9[163667]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:45:25 compute-0 sudo[163664]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:26 compute-0 sudo[163818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqptqujqgoskhqjfeimfzlfiwiualawa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843525.8865535-133-213885853694495/AnsiballZ_stat.py'
Feb 23 10:45:26 compute-0 sudo[163818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:26 compute-0 python3.9[163821]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:45:26 compute-0 sudo[163818]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:26 compute-0 sudo[163971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urezzrcdfdgefkpuyfxpgimkhlcytjfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843526.4535415-149-66875680549118/AnsiballZ_command.py'
Feb 23 10:45:26 compute-0 sudo[163971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:26 compute-0 python3.9[163974]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:45:26 compute-0 sudo[163971]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:27 compute-0 sudo[164125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcrdkfbfzqqzmqmvaffonmaqrzllyahu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843527.0913434-165-101990515390591/AnsiballZ_stat.py'
Feb 23 10:45:27 compute-0 sudo[164125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:27 compute-0 python3.9[164128]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:45:27 compute-0 sudo[164125]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:27 compute-0 sudo[164251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqkyuabongdexdcjbwrqvgksnfrlnagk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843527.0913434-165-101990515390591/AnsiballZ_copy.py'
Feb 23 10:45:27 compute-0 sudo[164251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:27 compute-0 sshd-session[164202]: Connection closed by authenticating user root 143.198.30.3 port 40128 [preauth]
Feb 23 10:45:28 compute-0 python3.9[164254]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843527.0913434-165-101990515390591/.source.iscsi _original_basename=.24vi7xm8 follow=False checksum=9f5d36b024e3d704d6276b5869bbcde649dc8509 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:28 compute-0 sudo[164251]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:28 compute-0 sudo[164404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-retbpbnpwvwzuqwvquhzrwmffyvfaeux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843528.3710089-195-144627126710883/AnsiballZ_file.py'
Feb 23 10:45:28 compute-0 sudo[164404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:28 compute-0 python3.9[164407]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:28 compute-0 sudo[164404]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:29 compute-0 sudo[164557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tafjrvetdgsztjjuxttwlhjyebtsxdob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843529.1751785-211-206928294396603/AnsiballZ_lineinfile.py'
Feb 23 10:45:29 compute-0 sudo[164557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:29 compute-0 python3.9[164560]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:29 compute-0 sudo[164557]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:30 compute-0 sudo[164710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcfsbfsuggucbxasmjehrgrmulpyqrmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843529.9772105-229-45709677959521/AnsiballZ_systemd_service.py'
Feb 23 10:45:30 compute-0 sudo[164710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:30 compute-0 python3.9[164713]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:45:30 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 23 10:45:30 compute-0 sudo[164710]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:31 compute-0 sudo[164867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaebvexghxqwaoqlwjsrthkssmdmtpre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843531.0815294-245-196758358779192/AnsiballZ_systemd_service.py'
Feb 23 10:45:31 compute-0 sudo[164867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:31 compute-0 python3.9[164870]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:45:31 compute-0 systemd[1]: Reloading.
Feb 23 10:45:31 compute-0 systemd-sysv-generator[164902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:45:31 compute-0 systemd-rc-local-generator[164897]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:45:32 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 23 10:45:32 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 23 10:45:32 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 23 10:45:32 compute-0 systemd[1]: Started Open-iSCSI.
Feb 23 10:45:32 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 23 10:45:32 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 23 10:45:32 compute-0 sudo[164867]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:33 compute-0 python3.9[165076]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:45:33 compute-0 network[165093]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:45:33 compute-0 network[165094]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:45:33 compute-0 network[165095]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:45:37 compute-0 sudo[165365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acovowkpswtpwtebbevdwqbhwhuadnxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843537.296707-291-4846845695637/AnsiballZ_dnf.py'
Feb 23 10:45:37 compute-0 sudo[165365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:37 compute-0 python3.9[165368]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:45:39 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:45:39 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:45:39 compute-0 systemd[1]: Reloading.
Feb 23 10:45:39 compute-0 systemd-sysv-generator[165416]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:45:39 compute-0 systemd-rc-local-generator[165413]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:45:39 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:45:40 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:45:40 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:45:40 compute-0 systemd[1]: run-ra515d8d1102c4d1cb6740147d7fb1ec3.service: Deactivated successfully.
Feb 23 10:45:40 compute-0 sudo[165365]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:41 compute-0 sudo[165697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwoejknayycnwtcyaxkkaohufvojfdsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843541.1342719-309-13965088329201/AnsiballZ_file.py'
Feb 23 10:45:41 compute-0 sudo[165697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:41 compute-0 python3.9[165700]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 10:45:41 compute-0 sudo[165697]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:42 compute-0 sudo[165850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncdtntyqeseowrxkzbcjdbbimvoqhoda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843541.9354486-325-274827794832601/AnsiballZ_modprobe.py'
Feb 23 10:45:42 compute-0 sudo[165850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:42 compute-0 python3.9[165853]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 23 10:45:42 compute-0 sudo[165850]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:43 compute-0 sudo[166007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjuajvtltpmbkawjphyeirkljsjaaarj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843542.8183403-341-84897353569494/AnsiballZ_stat.py'
Feb 23 10:45:43 compute-0 sudo[166007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:43 compute-0 python3.9[166010]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:45:43 compute-0 sudo[166007]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:43 compute-0 sudo[166131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtljwjluiezygzfwpdizxaxdwbfvqprf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843542.8183403-341-84897353569494/AnsiballZ_copy.py'
Feb 23 10:45:43 compute-0 sudo[166131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:43 compute-0 python3.9[166134]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843542.8183403-341-84897353569494/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:43 compute-0 sudo[166131]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:44 compute-0 sudo[166295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlidmbgojswnyrlilfqpvtwvchsdpprr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843544.26624-373-23187953141580/AnsiballZ_lineinfile.py'
Feb 23 10:45:44 compute-0 sudo[166295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:44 compute-0 podman[166258]: 2026-02-23 10:45:44.540850632 +0000 UTC m=+0.060912121 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 10:45:44 compute-0 python3.9[166300]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:44 compute-0 sudo[166295]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:45 compute-0 sudo[166457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzbprwaviwuvilkmbvvglcwxtvqywdpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843544.878235-389-95678615755420/AnsiballZ_systemd.py'
Feb 23 10:45:45 compute-0 sudo[166457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:45 compute-0 python3.9[166460]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:45:45 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 10:45:45 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 23 10:45:45 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 23 10:45:45 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 23 10:45:45 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 23 10:45:45 compute-0 sudo[166457]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:46 compute-0 sudo[166614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauokovqdgtilrrsbrrzsgpeqgabgtlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843546.0309303-405-138108396771294/AnsiballZ_command.py'
Feb 23 10:45:46 compute-0 sudo[166614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:46 compute-0 python3.9[166617]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:45:46 compute-0 sudo[166614]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:47 compute-0 sudo[166768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njfgpuvjkuvselluywnqowzgodqiyidb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843546.8410144-425-88895108516162/AnsiballZ_stat.py'
Feb 23 10:45:47 compute-0 sudo[166768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:47 compute-0 python3.9[166771]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:45:47 compute-0 sudo[166768]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:47 compute-0 sshd-session[166796]: Connection closed by authenticating user root 165.227.79.48 port 32934 [preauth]
Feb 23 10:45:47 compute-0 sudo[166923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrarldkkyvqstxwhucerwqjwanbqvikq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843547.6006439-443-187760756430043/AnsiballZ_stat.py'
Feb 23 10:45:47 compute-0 sudo[166923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:48 compute-0 python3.9[166926]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:45:48 compute-0 sudo[166923]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:48 compute-0 podman[166927]: 2026-02-23 10:45:48.163504856 +0000 UTC m=+0.069909174 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 10:45:48 compute-0 sudo[167073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbrnzanbnqlggdcnvdqpfjbsnenpiwmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843547.6006439-443-187760756430043/AnsiballZ_copy.py'
Feb 23 10:45:48 compute-0 sudo[167073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:48 compute-0 python3.9[167076]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843547.6006439-443-187760756430043/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:48 compute-0 sudo[167073]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:49 compute-0 sudo[167226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hanacazmkmrvkucqkspkuymofiwzqisq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843548.9570544-473-248189836619877/AnsiballZ_command.py'
Feb 23 10:45:49 compute-0 sudo[167226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:49 compute-0 python3.9[167229]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:45:49 compute-0 sudo[167226]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:49 compute-0 sudo[167380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edvtgzoirldssjhdyfnqqmtqemjxuqzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843549.6932297-489-154380642348034/AnsiballZ_lineinfile.py'
Feb 23 10:45:49 compute-0 sudo[167380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:50 compute-0 python3.9[167383]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:50 compute-0 sudo[167380]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:50 compute-0 sudo[167533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpkmrzhzskdqoepqizaecqtypnnghqnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843550.4455671-505-250409718124406/AnsiballZ_replace.py'
Feb 23 10:45:50 compute-0 sudo[167533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:51 compute-0 python3.9[167536]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:51 compute-0 sudo[167533]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:52 compute-0 sudo[167686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orrudtyddyogvyudcwanlvcvdatcqhee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843551.2484057-521-199760931565400/AnsiballZ_replace.py'
Feb 23 10:45:52 compute-0 sudo[167686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:52 compute-0 python3.9[167689]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:52 compute-0 sudo[167686]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:52 compute-0 sudo[167839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnyiigibzptpvvhqjmsqviagvxxuwghk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843552.4446652-539-248945950847212/AnsiballZ_lineinfile.py'
Feb 23 10:45:52 compute-0 sudo[167839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:52 compute-0 python3.9[167842]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:52 compute-0 sudo[167839]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:53 compute-0 sudo[167992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eooiiqqooviouyonumrmywgmpnqljjbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843552.9875576-539-94777260183871/AnsiballZ_lineinfile.py'
Feb 23 10:45:53 compute-0 sudo[167992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:53 compute-0 python3.9[167995]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:53 compute-0 sudo[167992]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:53 compute-0 sudo[168145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvdhdqnwvtstxixuonuutgkmcijbpder ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843553.58902-539-66874789129996/AnsiballZ_lineinfile.py'
Feb 23 10:45:53 compute-0 sudo[168145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:53 compute-0 python3.9[168148]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:54 compute-0 sudo[168145]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:54 compute-0 sudo[168298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvhywlvbkpkotzggtatzbcppxiyojjxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843554.1424148-539-153761900544802/AnsiballZ_lineinfile.py'
Feb 23 10:45:54 compute-0 sudo[168298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:54 compute-0 python3.9[168301]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:45:54 compute-0 sudo[168298]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:55 compute-0 sudo[168451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jidduucjepqrknngsoccceffzkkrysmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843555.2075129-597-240924780265537/AnsiballZ_stat.py'
Feb 23 10:45:55 compute-0 sudo[168451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:55 compute-0 python3.9[168454]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:45:55 compute-0 sudo[168451]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:56 compute-0 sudo[168606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciyfzatxnnpzjvekuozlcuuuvlxoqgzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843555.926881-613-256332490881145/AnsiballZ_command.py'
Feb 23 10:45:56 compute-0 sudo[168606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:56 compute-0 python3.9[168609]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:45:56 compute-0 sudo[168606]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:57 compute-0 sudo[168760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duzdpjoytyyktaaarfpjmttkizrpzvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843556.7348888-631-206403912694833/AnsiballZ_systemd_service.py'
Feb 23 10:45:57 compute-0 sudo[168760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:57 compute-0 python3.9[168763]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:45:57 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 23 10:45:57 compute-0 sudo[168760]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:57 compute-0 sudo[168917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cacjznfviefxbausaqbqsdjkvutxzren ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843557.7334673-647-28713896724194/AnsiballZ_systemd_service.py'
Feb 23 10:45:57 compute-0 sudo[168917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:58 compute-0 python3.9[168920]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:45:58 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 23 10:45:58 compute-0 udevadm[168925]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 23 10:45:58 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 23 10:45:58 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 23 10:45:58 compute-0 multipathd[168928]: --------start up--------
Feb 23 10:45:58 compute-0 multipathd[168928]: read /etc/multipath.conf
Feb 23 10:45:58 compute-0 multipathd[168928]: path checkers start up
Feb 23 10:45:58 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 23 10:45:58 compute-0 sudo[168917]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:59 compute-0 sudo[169086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mennhauijnpsyprvekuknjugxjlofwzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843559.0390067-671-217056496256113/AnsiballZ_file.py'
Feb 23 10:45:59 compute-0 sudo[169086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:45:59 compute-0 python3.9[169089]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 10:45:59 compute-0 sudo[169086]: pam_unix(sudo:session): session closed for user root
Feb 23 10:45:59 compute-0 sudo[169239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwwkofcmomcedrbdybsreusoblgtbxsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843559.7081132-687-220311169360397/AnsiballZ_modprobe.py'
Feb 23 10:45:59 compute-0 sudo[169239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:00 compute-0 python3.9[169242]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 23 10:46:00 compute-0 kernel: Key type psk registered
Feb 23 10:46:00 compute-0 sudo[169239]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:00 compute-0 sudo[169402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfdmzspkgjoljquoptjtanyfulxfqcel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843560.5228388-703-87519766857179/AnsiballZ_stat.py'
Feb 23 10:46:00 compute-0 sudo[169402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:00 compute-0 python3.9[169405]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:46:00 compute-0 sudo[169402]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:01 compute-0 sudo[169526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnazzwlznchtqqkxhvxjvvnrhkgiydxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843560.5228388-703-87519766857179/AnsiballZ_copy.py'
Feb 23 10:46:01 compute-0 sudo[169526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:01 compute-0 python3.9[169529]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843560.5228388-703-87519766857179/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:01 compute-0 sudo[169526]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:02 compute-0 sshd-session[169554]: Connection closed by authenticating user root 143.198.30.3 port 44750 [preauth]
Feb 23 10:46:02 compute-0 sudo[169681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxpggtxipmqeqmcswjuijcivoyjxgalb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843561.9521296-735-240900810430006/AnsiballZ_lineinfile.py'
Feb 23 10:46:02 compute-0 sudo[169681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:02 compute-0 python3.9[169684]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:02 compute-0 sudo[169681]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:03 compute-0 sudo[169834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucazdgfzpvpktemtfcqdtikopyysdgln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843562.8784223-751-195697143694264/AnsiballZ_systemd.py'
Feb 23 10:46:03 compute-0 sudo[169834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:03 compute-0 python3.9[169837]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:46:03 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 10:46:03 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 23 10:46:03 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 23 10:46:03 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 23 10:46:03 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 23 10:46:03 compute-0 sudo[169834]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:03 compute-0 sudo[169991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysifagnoohxtwrwpatedppiyzsseglyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843563.7535017-767-14804194637303/AnsiballZ_dnf.py'
Feb 23 10:46:03 compute-0 sudo[169991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:04 compute-0 python3.9[169994]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 10:46:06 compute-0 systemd[1]: Reloading.
Feb 23 10:46:06 compute-0 systemd-rc-local-generator[170018]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:46:06 compute-0 systemd-sysv-generator[170024]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:46:06 compute-0 systemd[1]: Reloading.
Feb 23 10:46:06 compute-0 systemd-sysv-generator[170068]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:46:06 compute-0 systemd-rc-local-generator[170063]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:46:07 compute-0 systemd-logind[808]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 23 10:46:07 compute-0 systemd-logind[808]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 23 10:46:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 10:46:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 23 10:46:07 compute-0 systemd[1]: Reloading.
Feb 23 10:46:07 compute-0 systemd-sysv-generator[170171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:46:07 compute-0 systemd-rc-local-generator[170168]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:46:07 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 10:46:07 compute-0 sudo[169991]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 10:46:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 23 10:46:08 compute-0 systemd[1]: run-r834e5a9009264ebbbb85aae4d23ee1ab.service: Deactivated successfully.
Feb 23 10:46:08 compute-0 sudo[171486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnwldmbxpfzmbrgezbcsdwzronwsrfda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843568.1575422-783-32609963012168/AnsiballZ_systemd_service.py'
Feb 23 10:46:08 compute-0 sudo[171486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:08 compute-0 python3.9[171489]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:46:08 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 23 10:46:08 compute-0 iscsid[164916]: iscsid shutting down.
Feb 23 10:46:08 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 23 10:46:08 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 23 10:46:08 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 23 10:46:08 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 23 10:46:08 compute-0 systemd[1]: Started Open-iSCSI.
Feb 23 10:46:08 compute-0 sudo[171486]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:09 compute-0 sudo[171644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltdhugojyuhzsbvcecolreqlmfadhxpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843569.1331818-799-215936260199234/AnsiballZ_systemd_service.py'
Feb 23 10:46:09 compute-0 sudo[171644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:09 compute-0 python3.9[171647]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:46:09 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 23 10:46:09 compute-0 multipathd[168928]: exit (signal)
Feb 23 10:46:09 compute-0 multipathd[168928]: --------shut down-------
Feb 23 10:46:09 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 23 10:46:09 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 23 10:46:09 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 23 10:46:09 compute-0 multipathd[171653]: --------start up--------
Feb 23 10:46:09 compute-0 multipathd[171653]: read /etc/multipath.conf
Feb 23 10:46:09 compute-0 multipathd[171653]: path checkers start up
Feb 23 10:46:09 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 23 10:46:09 compute-0 sudo[171644]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:09 compute-0 auditd[719]: Audit daemon rotating log files
Feb 23 10:46:10 compute-0 python3.9[171811]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:46:11 compute-0 sudo[171965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhjokhorgexoqfnhhkjjndxiuruldjum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843571.2004406-834-56512284815153/AnsiballZ_file.py'
Feb 23 10:46:11 compute-0 sudo[171965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:11 compute-0 python3.9[171968]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:11 compute-0 sudo[171965]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:12 compute-0 sudo[172118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kulhqgprdirawqxlxvmwmiafuarmdaem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843572.1168401-856-223819911550486/AnsiballZ_systemd_service.py'
Feb 23 10:46:12 compute-0 sudo[172118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:46:12.623 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:46:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:46:12.624 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:46:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:46:12.624 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:46:12 compute-0 python3.9[172121]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:46:12 compute-0 systemd[1]: Reloading.
Feb 23 10:46:12 compute-0 systemd-sysv-generator[172151]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:46:12 compute-0 systemd-rc-local-generator[172146]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:46:12 compute-0 sudo[172118]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:13 compute-0 python3.9[172313]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:46:13 compute-0 network[172330]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:46:13 compute-0 network[172331]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:46:13 compute-0 network[172332]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:46:14 compute-0 podman[172367]: 2026-02-23 10:46:14.616331481 +0000 UTC m=+0.046318125 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:46:17 compute-0 sudo[172622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkqxnictxdiafdlfwvehrhjwbdpvxoyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843577.4291854-894-244169449524132/AnsiballZ_systemd_service.py'
Feb 23 10:46:17 compute-0 sudo[172622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:17 compute-0 python3.9[172625]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:17 compute-0 sudo[172622]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:18 compute-0 sudo[172793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqxexjdvbblshjulxbsnwfzqxobrlpur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843578.0514543-894-137180781363759/AnsiballZ_systemd_service.py'
Feb 23 10:46:18 compute-0 sudo[172793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:18 compute-0 podman[172750]: 2026-02-23 10:46:18.358546978 +0000 UTC m=+0.092743483 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 10:46:18 compute-0 python3.9[172802]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:18 compute-0 sudo[172793]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:19 compute-0 sudo[172957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szdfcgrjloiinkjrihgmsnsuprmcffqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843578.774645-894-147143056949553/AnsiballZ_systemd_service.py'
Feb 23 10:46:19 compute-0 sudo[172957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:19 compute-0 python3.9[172960]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:19 compute-0 sudo[172957]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:19 compute-0 sudo[173111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouslhsapztuftbkussoxkzxhuzbgrqum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843579.664181-894-54793094906176/AnsiballZ_systemd_service.py'
Feb 23 10:46:19 compute-0 sudo[173111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:20 compute-0 python3.9[173114]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:20 compute-0 sudo[173111]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:20 compute-0 sudo[173265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywqgurkwabmdsvbhcpdhveinieugirxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843580.4424136-894-163931186281303/AnsiballZ_systemd_service.py'
Feb 23 10:46:20 compute-0 sudo[173265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:21 compute-0 python3.9[173268]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:21 compute-0 sudo[173265]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:21 compute-0 sudo[173419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhhmruhimzfcxqnffbogtchscefbjpsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843581.1994734-894-25488236796329/AnsiballZ_systemd_service.py'
Feb 23 10:46:21 compute-0 sudo[173419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:21 compute-0 python3.9[173422]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:21 compute-0 sudo[173419]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:22 compute-0 sudo[173573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztcfhfykfxfnhbqpnrqgwrceccqwxgfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843581.8212988-894-35680322495449/AnsiballZ_systemd_service.py'
Feb 23 10:46:22 compute-0 sudo[173573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:22 compute-0 python3.9[173576]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:22 compute-0 sudo[173573]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:22 compute-0 sudo[173727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btpmdsqwuhhjrtezzbdpeosucbgvoaeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843582.490284-894-4278591884529/AnsiballZ_systemd_service.py'
Feb 23 10:46:22 compute-0 sudo[173727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:23 compute-0 python3.9[173730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:46:23 compute-0 sudo[173727]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:24 compute-0 sudo[173881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoynmckzzfzbptahsnkiicbwqmpaimeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843584.5618198-1012-135020800905136/AnsiballZ_file.py'
Feb 23 10:46:24 compute-0 sudo[173881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:24 compute-0 python3.9[173884]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:24 compute-0 sudo[173881]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:25 compute-0 sudo[174034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qebzrupwawuvrcnaceecgzfuqplpincd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843585.0681098-1012-189949740243628/AnsiballZ_file.py'
Feb 23 10:46:25 compute-0 sudo[174034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:25 compute-0 python3.9[174037]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:25 compute-0 sudo[174034]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:25 compute-0 sudo[174187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtyeiryyawnbtrlaxiqbuczdjpengmzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843585.5559053-1012-265082615697212/AnsiballZ_file.py'
Feb 23 10:46:25 compute-0 sudo[174187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:25 compute-0 python3.9[174190]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:25 compute-0 sudo[174187]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:26 compute-0 sudo[174340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxnisafukirjrydffxmwpesmummjwlli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843586.0842304-1012-147684696850637/AnsiballZ_file.py'
Feb 23 10:46:26 compute-0 sudo[174340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:26 compute-0 python3.9[174343]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:26 compute-0 sudo[174340]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:26 compute-0 sudo[174493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyidhahashtoxrlulwerafcudqftvvga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843586.6224878-1012-273125312728986/AnsiballZ_file.py'
Feb 23 10:46:26 compute-0 sudo[174493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:27 compute-0 python3.9[174496]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:27 compute-0 sudo[174493]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:27 compute-0 sudo[174646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frbmbgjukfbmrecrgwceqwzrrjvdkgui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843587.2608526-1012-19096330588030/AnsiballZ_file.py'
Feb 23 10:46:27 compute-0 sudo[174646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:27 compute-0 python3.9[174649]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:27 compute-0 sudo[174646]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:27 compute-0 sudo[174799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjcccribbegzciolmwxkuowqbbsymqss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843587.7387438-1012-186643890572851/AnsiballZ_file.py'
Feb 23 10:46:27 compute-0 sudo[174799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:28 compute-0 python3.9[174802]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:28 compute-0 sudo[174799]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:28 compute-0 sudo[174952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxymkwnckcfamnhoytgjbksesljertas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843588.2695487-1012-47773670115644/AnsiballZ_file.py'
Feb 23 10:46:28 compute-0 sudo[174952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:28 compute-0 python3.9[174955]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:28 compute-0 sudo[174952]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:30 compute-0 sudo[175105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psqzfwihxhendvtssbhdnybwftdthedz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843589.8589258-1126-202617698818963/AnsiballZ_file.py'
Feb 23 10:46:30 compute-0 sudo[175105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:30 compute-0 python3.9[175108]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:30 compute-0 sudo[175105]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:30 compute-0 sudo[175258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhvsaswlhiktnkfbnisnqmtkjazxltxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843590.4059129-1126-63169234141882/AnsiballZ_file.py'
Feb 23 10:46:30 compute-0 sudo[175258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:30 compute-0 python3.9[175261]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:30 compute-0 sudo[175258]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:31 compute-0 sudo[175411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geoakiktmtszcwxrwlsdltkmepnkbnnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843591.0769773-1126-29394915264632/AnsiballZ_file.py'
Feb 23 10:46:31 compute-0 sudo[175411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:31 compute-0 python3.9[175414]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:31 compute-0 sudo[175411]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:31 compute-0 sudo[175564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mysassqryiyuekjnctnijknnizmtmkfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843591.589549-1126-275052300189211/AnsiballZ_file.py'
Feb 23 10:46:31 compute-0 sudo[175564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:31 compute-0 python3.9[175567]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:31 compute-0 sudo[175564]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:32 compute-0 sudo[175717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rztxdtosvepqajgydgpgqmhmkfmwsate ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843592.0818305-1126-44930244773930/AnsiballZ_file.py'
Feb 23 10:46:32 compute-0 sudo[175717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:32 compute-0 python3.9[175720]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:32 compute-0 sudo[175717]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:32 compute-0 sudo[175870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lebdhjoacrsgleilroeosuurdxcdfgfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843592.6893942-1126-95821317318809/AnsiballZ_file.py'
Feb 23 10:46:32 compute-0 sudo[175870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:33 compute-0 python3.9[175873]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:33 compute-0 sudo[175870]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:33 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 23 10:46:33 compute-0 sudo[176024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wenfchrdvwcxmszcimmbjldriabesdtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843593.26444-1126-115960580045144/AnsiballZ_file.py'
Feb 23 10:46:33 compute-0 sudo[176024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:33 compute-0 python3.9[176027]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:33 compute-0 sudo[176024]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:34 compute-0 sudo[176179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noayxbpqbithvppardvpxjkxpumqlwwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843593.8348262-1126-157833450145570/AnsiballZ_file.py'
Feb 23 10:46:34 compute-0 sshd-session[176127]: Connection closed by authenticating user root 143.198.30.3 port 46866 [preauth]
Feb 23 10:46:34 compute-0 sudo[176179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:34 compute-0 python3.9[176182]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:46:34 compute-0 sudo[176179]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:34 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 23 10:46:35 compute-0 sudo[176333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muzredbdjtpybhtfxpjzxywfehzixpoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843594.8919294-1242-92472296382523/AnsiballZ_command.py'
Feb 23 10:46:35 compute-0 sudo[176333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:35 compute-0 python3.9[176336]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:35 compute-0 sudo[176333]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:35 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 23 10:46:36 compute-0 python3.9[176489]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 10:46:36 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 23 10:46:36 compute-0 sudo[176640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaskvlgpkqchxjqefpsdxgmdsbtyenqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843596.62651-1278-30476555577198/AnsiballZ_systemd_service.py'
Feb 23 10:46:36 compute-0 sudo[176640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:37 compute-0 python3.9[176643]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:46:37 compute-0 systemd[1]: Reloading.
Feb 23 10:46:37 compute-0 systemd-sysv-generator[176675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:46:37 compute-0 systemd-rc-local-generator[176670]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:46:37 compute-0 sudo[176640]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:37 compute-0 sshd-session[176710]: Connection closed by authenticating user root 165.227.79.48 port 38560 [preauth]
Feb 23 10:46:37 compute-0 sudo[176837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-texhycnxgkngchksenjmoljandikiiwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843597.6548915-1294-112746185612995/AnsiballZ_command.py'
Feb 23 10:46:37 compute-0 sudo[176837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:38 compute-0 python3.9[176840]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:39 compute-0 sudo[176837]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:39 compute-0 sudo[176991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gonyicmuxueynqbvsniodrlsiaalmyah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843599.2325678-1294-75970305021569/AnsiballZ_command.py'
Feb 23 10:46:39 compute-0 sudo[176991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:39 compute-0 python3.9[176994]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:39 compute-0 sudo[176991]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:40 compute-0 sudo[177145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfefjgkteltbvltsopqvwcizytdgooqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843599.8603444-1294-277904841955683/AnsiballZ_command.py'
Feb 23 10:46:40 compute-0 sudo[177145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:40 compute-0 python3.9[177148]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:40 compute-0 sudo[177145]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:40 compute-0 sudo[177299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxjevsddzdgtrosadelvjxhyxsyiwgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843600.4905608-1294-225139533402741/AnsiballZ_command.py'
Feb 23 10:46:40 compute-0 sudo[177299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:40 compute-0 python3.9[177302]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:40 compute-0 sudo[177299]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:41 compute-0 sudo[177453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxkfrzxxjxdguazcqmkxaghlyhdajiwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843601.0449526-1294-86068033863844/AnsiballZ_command.py'
Feb 23 10:46:41 compute-0 sudo[177453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:41 compute-0 python3.9[177456]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:41 compute-0 sudo[177453]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:41 compute-0 sudo[177607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxbibjtkdzkwrowfwwacsszryxqxdgnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843601.6103683-1294-75150930526367/AnsiballZ_command.py'
Feb 23 10:46:41 compute-0 sudo[177607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:42 compute-0 python3.9[177610]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:42 compute-0 sudo[177607]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:42 compute-0 sudo[177761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emwcfviltxzmdpdmpccvzqprirqdjokl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843602.1992702-1294-181303988847464/AnsiballZ_command.py'
Feb 23 10:46:42 compute-0 sudo[177761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:42 compute-0 python3.9[177764]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:42 compute-0 sudo[177761]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:43 compute-0 sudo[177915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elesurovhehplfotegpsfcgygpvqqpqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843602.861096-1294-136515058527401/AnsiballZ_command.py'
Feb 23 10:46:43 compute-0 sudo[177915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:43 compute-0 python3.9[177918]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:46:43 compute-0 sudo[177915]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:44 compute-0 sudo[178076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdhrkvjtrdhofbzgwczcseknovxifsdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843604.4490986-1437-143406143300474/AnsiballZ_file.py'
Feb 23 10:46:44 compute-0 sudo[178076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:44 compute-0 podman[178043]: 2026-02-23 10:46:44.741735348 +0000 UTC m=+0.069121557 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:46:44 compute-0 python3.9[178083]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:44 compute-0 sudo[178076]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:45 compute-0 sudo[178239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekzlomarifkbymmxdnvtdtwltmkhuwli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843605.0114546-1437-270039951570912/AnsiballZ_file.py'
Feb 23 10:46:45 compute-0 sudo[178239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:45 compute-0 python3.9[178242]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:45 compute-0 sudo[178239]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:45 compute-0 sudo[178392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pydbyzntlwoxyzbtxfqsmzbpgjwjovdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843605.6597614-1467-137894303571530/AnsiballZ_file.py'
Feb 23 10:46:45 compute-0 sudo[178392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:46 compute-0 python3.9[178395]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:46 compute-0 sudo[178392]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:46 compute-0 sudo[178545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lheztepuvkszsgtvvqkptljjzihvecki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843606.209889-1467-43689098602893/AnsiballZ_file.py'
Feb 23 10:46:46 compute-0 sudo[178545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:46 compute-0 python3.9[178548]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:46 compute-0 sudo[178545]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:47 compute-0 sudo[178698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfthhsdldvywhhrhhaqpefieptpdmrra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843606.8406804-1467-271934545087984/AnsiballZ_file.py'
Feb 23 10:46:47 compute-0 sudo[178698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:47 compute-0 python3.9[178701]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:47 compute-0 sudo[178698]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:47 compute-0 sudo[178851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gknsilkrvlxytntgbosopivjxjknlyey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843607.4845233-1467-182587303909659/AnsiballZ_file.py'
Feb 23 10:46:47 compute-0 sudo[178851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:47 compute-0 python3.9[178854]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:47 compute-0 sudo[178851]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:48 compute-0 sudo[179004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdenahryupcgdrdbdtkfudkqjfeyjfsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843608.084213-1467-268854487359333/AnsiballZ_file.py'
Feb 23 10:46:48 compute-0 sudo[179004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:48 compute-0 podman[179006]: 2026-02-23 10:46:48.513485091 +0000 UTC m=+0.098058960 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 10:46:48 compute-0 python3.9[179008]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:48 compute-0 sudo[179004]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:49 compute-0 sudo[179183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmocjmcoxiealtcengfmwwjqdbbjnps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843608.7769063-1467-97717938151459/AnsiballZ_file.py'
Feb 23 10:46:49 compute-0 sudo[179183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:49 compute-0 python3.9[179186]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:49 compute-0 sudo[179183]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:49 compute-0 sudo[179336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmwalvwktphrwmtpiptmseglhzvmtxea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843609.3609703-1467-180838969823509/AnsiballZ_file.py'
Feb 23 10:46:49 compute-0 sudo[179336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:49 compute-0 python3.9[179339]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:46:49 compute-0 sudo[179336]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:54 compute-0 sudo[179489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkwtjgqdbyfeeqqcmuenktxlvqqdqffv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843614.53651-1704-17773072716970/AnsiballZ_getent.py'
Feb 23 10:46:54 compute-0 sudo[179489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:55 compute-0 python3.9[179492]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 23 10:46:55 compute-0 sudo[179489]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:55 compute-0 sudo[179643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fplpfnfcrdsbpkidfidfasfeuvmkwxek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843615.3105989-1720-150839804375592/AnsiballZ_group.py'
Feb 23 10:46:55 compute-0 sudo[179643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:55 compute-0 python3.9[179646]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 10:46:55 compute-0 groupadd[179647]: group added to /etc/group: name=nova, GID=42436
Feb 23 10:46:55 compute-0 groupadd[179647]: group added to /etc/gshadow: name=nova
Feb 23 10:46:55 compute-0 groupadd[179647]: new group: name=nova, GID=42436
Feb 23 10:46:55 compute-0 sudo[179643]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:56 compute-0 sudo[179802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvbcxqdockramuliutryoxfkkbeqpaxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843616.1603053-1736-92659848781307/AnsiballZ_user.py'
Feb 23 10:46:56 compute-0 sudo[179802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:46:56 compute-0 python3.9[179805]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 10:46:56 compute-0 useradd[179807]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 23 10:46:56 compute-0 useradd[179807]: add 'nova' to group 'libvirt'
Feb 23 10:46:56 compute-0 useradd[179807]: add 'nova' to shadow group 'libvirt'
Feb 23 10:46:56 compute-0 sudo[179802]: pam_unix(sudo:session): session closed for user root
Feb 23 10:46:58 compute-0 sshd-session[179838]: Accepted publickey for zuul from 192.168.122.30 port 34178 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:46:58 compute-0 systemd-logind[808]: New session 25 of user zuul.
Feb 23 10:46:58 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 23 10:46:58 compute-0 sshd-session[179838]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:46:58 compute-0 sshd-session[179841]: Received disconnect from 192.168.122.30 port 34178:11: disconnected by user
Feb 23 10:46:58 compute-0 sshd-session[179841]: Disconnected from user zuul 192.168.122.30 port 34178
Feb 23 10:46:58 compute-0 sshd-session[179838]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:46:58 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 23 10:46:58 compute-0 systemd-logind[808]: Session 25 logged out. Waiting for processes to exit.
Feb 23 10:46:58 compute-0 systemd-logind[808]: Removed session 25.
Feb 23 10:46:58 compute-0 python3.9[179991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:46:59 compute-0 python3.9[180067]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:00 compute-0 python3.9[180217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:00 compute-0 python3.9[180338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843619.5219045-1786-273626003001264/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:01 compute-0 python3.9[180488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:01 compute-0 python3.9[180609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843620.6756394-1786-270114329049663/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:02 compute-0 python3.9[180759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:02 compute-0 python3.9[180880]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843621.7041645-1786-273803838193051/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:03 compute-0 python3.9[181030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:03 compute-0 python3.9[181151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843622.9343424-1894-75940125639743/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:04 compute-0 sudo[181301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iikdjwlkijjscefkbxkihyieolceiypy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843624.070052-1924-102106979490957/AnsiballZ_file.py'
Feb 23 10:47:04 compute-0 sudo[181301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:04 compute-0 python3.9[181304]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:04 compute-0 sudo[181301]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:05 compute-0 sudo[181454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-octewjknzjmfmvdbwohfczezyeueyfjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843624.746272-1940-37035914574413/AnsiballZ_copy.py'
Feb 23 10:47:05 compute-0 sudo[181454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:05 compute-0 python3.9[181457]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:05 compute-0 sudo[181454]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:05 compute-0 sudo[181607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eixamsjittfavgbxqfrkbygpwopzctvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843625.452367-1956-70302603430966/AnsiballZ_stat.py'
Feb 23 10:47:05 compute-0 sudo[181607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:05 compute-0 python3.9[181610]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:05 compute-0 sudo[181607]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:06 compute-0 sudo[181760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sezcnirzbtnosakubsdiaiobclmtrphe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843626.1569874-1972-226731323716143/AnsiballZ_stat.py'
Feb 23 10:47:06 compute-0 sudo[181760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:06 compute-0 python3.9[181763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:06 compute-0 sudo[181760]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:06 compute-0 sudo[181884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysqlpqsnvgvrldjupcxvszdzekcnoaoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843626.1569874-1972-226731323716143/AnsiballZ_copy.py'
Feb 23 10:47:06 compute-0 sudo[181884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:07 compute-0 python3.9[181887]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1771843626.1569874-1972-226731323716143/.source _original_basename=.tdf45eay follow=False checksum=69aa05a372bbdceea92b8dffe1b484fbc5c30ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 23 10:47:07 compute-0 sudo[181884]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:07 compute-0 sshd-session[181989]: Connection closed by authenticating user root 143.198.30.3 port 60406 [preauth]
Feb 23 10:47:07 compute-0 python3.9[182041]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:08 compute-0 sudo[182193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyezqqbrpcnrfiseigjxlfmeeygvivpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843628.2688146-2028-15655016499974/AnsiballZ_file.py'
Feb 23 10:47:08 compute-0 sudo[182193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:08 compute-0 python3.9[182196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:08 compute-0 sudo[182193]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:09 compute-0 sudo[182346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dngdnnigmhctesxtqnnwuyjixrggboit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843628.937244-2044-155750040042136/AnsiballZ_file.py'
Feb 23 10:47:09 compute-0 sudo[182346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:09 compute-0 python3.9[182349]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:09 compute-0 sudo[182346]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:10 compute-0 python3.9[182499]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:12 compute-0 sudo[182920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvmihtgdyaldvhtegygxemjnqpdmzqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843631.7528973-2112-222719586981384/AnsiballZ_container_config_data.py'
Feb 23 10:47:12 compute-0 sudo[182920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:12 compute-0 python3.9[182923]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 23 10:47:12 compute-0 sudo[182920]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:47:12.625 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:47:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:47:12.626 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:47:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:47:12.626 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:47:13 compute-0 sudo[183073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwysskhdbtgezxwngsgkkqdyoatrbora ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843632.8585017-2134-254451440928808/AnsiballZ_container_config_hash.py'
Feb 23 10:47:13 compute-0 sudo[183073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:13 compute-0 python3.9[183076]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 10:47:13 compute-0 sudo[183073]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:14 compute-0 sudo[183226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eddelicwknleszacqwvtcaiqmaxgwiyr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843633.8740547-2154-275546072314370/AnsiballZ_edpm_container_manage.py'
Feb 23 10:47:14 compute-0 sudo[183226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:14 compute-0 python3[183229]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 10:47:14 compute-0 podman[183266]: 2026-02-23 10:47:14.653619756 +0000 UTC m=+0.048360792 container create 4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260216, config_id=nova_compute_init)
Feb 23 10:47:14 compute-0 podman[183266]: 2026-02-23 10:47:14.627899339 +0000 UTC m=+0.022640435 image pull 72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 10:47:14 compute-0 python3[183229]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 23 10:47:14 compute-0 sudo[183226]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:14 compute-0 podman[183303]: 2026-02-23 10:47:14.844370221 +0000 UTC m=+0.048932098 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:47:15 compute-0 sudo[183471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpfjdygtifyjteedwmhbvccjptyjqusg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843635.0396762-2170-245231433435065/AnsiballZ_stat.py'
Feb 23 10:47:15 compute-0 sudo[183471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:15 compute-0 python3.9[183474]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:15 compute-0 sudo[183471]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:16 compute-0 python3.9[183626]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 10:47:17 compute-0 sudo[183776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yftnxcfswdvjvyjvteljtluohbovbbuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843637.1385262-2224-141383311587445/AnsiballZ_stat.py'
Feb 23 10:47:17 compute-0 sudo[183776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:17 compute-0 python3.9[183779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:17 compute-0 sudo[183776]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:17 compute-0 sudo[183902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dklqkejobwgepxzvbbglfsywwwdfrzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843637.1385262-2224-141383311587445/AnsiballZ_copy.py'
Feb 23 10:47:17 compute-0 sudo[183902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:17 compute-0 python3.9[183905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843637.1385262-2224-141383311587445/.source.yaml _original_basename=.rs1i84ky follow=False checksum=3cad4d9ff70adbb8d4e3ecb522db823181444395 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:18 compute-0 sudo[183902]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:18 compute-0 sudo[184067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrqiqznbrjumdohrtwunhtlebzpayivr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843638.5854402-2258-153882555689913/AnsiballZ_file.py'
Feb 23 10:47:18 compute-0 sudo[184067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:18 compute-0 podman[184029]: 2026-02-23 10:47:18.887019838 +0000 UTC m=+0.081598560 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:47:19 compute-0 python3.9[184074]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:19 compute-0 sudo[184067]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:19 compute-0 sudo[184233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rapbxsxoogxotptmwbxeqdkgkvxwnjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843639.3911695-2274-250827277715872/AnsiballZ_file.py'
Feb 23 10:47:19 compute-0 sudo[184233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:19 compute-0 python3.9[184236]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:47:19 compute-0 sudo[184233]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:20 compute-0 sudo[184386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emrbxyotzqvmedhrhyibmogfocjwmdcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843640.0861287-2290-8503012203469/AnsiballZ_stat.py'
Feb 23 10:47:20 compute-0 sudo[184386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:20 compute-0 python3.9[184389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:20 compute-0 sudo[184386]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:20 compute-0 sudo[184510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axzyljqjugandjzfwztoqtniqfzhsaii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843640.0861287-2290-8503012203469/AnsiballZ_copy.py'
Feb 23 10:47:20 compute-0 sudo[184510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:21 compute-0 python3.9[184513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843640.0861287-2290-8503012203469/.source.json _original_basename=.auyhmdzx follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:21 compute-0 sudo[184510]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:21 compute-0 python3.9[184663]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:23 compute-0 sudo[185084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sztyyyrinjqhsopdvismcseolwoyeoxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843643.4320211-2370-199991807349778/AnsiballZ_container_config_data.py'
Feb 23 10:47:23 compute-0 sudo[185084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:24 compute-0 python3.9[185087]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 23 10:47:24 compute-0 sudo[185084]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:24 compute-0 sudo[185237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqdzabkeusgllxkvxgoazypsxsffkqfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843644.6116483-2392-173070794083977/AnsiballZ_container_config_hash.py'
Feb 23 10:47:24 compute-0 sudo[185237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:25 compute-0 python3.9[185240]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 10:47:25 compute-0 sudo[185237]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:25 compute-0 sshd-session[185249]: Connection closed by authenticating user root 165.227.79.48 port 37880 [preauth]
Feb 23 10:47:25 compute-0 sudo[185392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwvvadjomqjbosppdskxaiittanjvpu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843645.4917836-2412-183723944407426/AnsiballZ_edpm_container_manage.py'
Feb 23 10:47:25 compute-0 sudo[185392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:26 compute-0 python3[185395]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 10:47:26 compute-0 podman[185432]: 2026-02-23 10:47:26.202194064 +0000 UTC m=+0.055931165 container create 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:47:26 compute-0 podman[185432]: 2026-02-23 10:47:26.170069346 +0000 UTC m=+0.023806547 image pull 72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 10:47:26 compute-0 python3[185395]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 23 10:47:26 compute-0 sudo[185392]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:28 compute-0 sudo[185620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdibcmqcttjkcuiwnpltsutaomujrynb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843647.6713798-2428-256716676955324/AnsiballZ_stat.py'
Feb 23 10:47:28 compute-0 sudo[185620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:28 compute-0 python3.9[185623]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:28 compute-0 sudo[185620]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:28 compute-0 sudo[185775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skzvpqpaykoldtkavcxxzdcmtyjxhdig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843648.6262276-2446-159908262382073/AnsiballZ_file.py'
Feb 23 10:47:28 compute-0 sudo[185775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:29 compute-0 python3.9[185778]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:29 compute-0 sudo[185775]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:29 compute-0 sudo[185852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojbbfqdkfnyqrqkgalegjhqrlxafhev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843648.6262276-2446-159908262382073/AnsiballZ_stat.py'
Feb 23 10:47:29 compute-0 sudo[185852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:29 compute-0 python3.9[185855]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:29 compute-0 sudo[185852]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:29 compute-0 sudo[186004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfyntiavxerdakypxlqtjworydqsdejn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843649.5204096-2446-188093306452233/AnsiballZ_copy.py'
Feb 23 10:47:29 compute-0 sudo[186004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:30 compute-0 python3.9[186007]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771843649.5204096-2446-188093306452233/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:30 compute-0 sudo[186004]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:30 compute-0 sudo[186081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejpgystxhidcbywkrzgzfdujlywhzhma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843649.5204096-2446-188093306452233/AnsiballZ_systemd.py'
Feb 23 10:47:30 compute-0 sudo[186081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:30 compute-0 python3.9[186084]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:47:30 compute-0 systemd[1]: Reloading.
Feb 23 10:47:30 compute-0 systemd-sysv-generator[186110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:47:30 compute-0 systemd-rc-local-generator[186107]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:47:30 compute-0 sudo[186081]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:30 compute-0 sudo[186200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbviqjsquuvmyvzydghmkykarmjjllbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843649.5204096-2446-188093306452233/AnsiballZ_systemd.py'
Feb 23 10:47:30 compute-0 sudo[186200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:31 compute-0 python3.9[186203]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:47:31 compute-0 systemd[1]: Reloading.
Feb 23 10:47:31 compute-0 systemd-rc-local-generator[186234]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:47:31 compute-0 systemd-sysv-generator[186241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:47:31 compute-0 systemd[1]: Starting nova_compute container...
Feb 23 10:47:31 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:47:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:31 compute-0 podman[186250]: 2026-02-23 10:47:31.618258336 +0000 UTC m=+0.075901538 container init 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:47:31 compute-0 podman[186250]: 2026-02-23 10:47:31.625624893 +0000 UTC m=+0.083268045 container start 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, io.buildah.version=1.43.0)
Feb 23 10:47:31 compute-0 podman[186250]: nova_compute
Feb 23 10:47:31 compute-0 nova_compute[186266]: + sudo -E kolla_set_configs
Feb 23 10:47:31 compute-0 systemd[1]: Started nova_compute container.
Feb 23 10:47:31 compute-0 sudo[186200]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Validating config file
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying service configuration files
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Deleting /etc/ceph
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Creating directory /etc/ceph
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /etc/ceph
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Writing out command to execute
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:31 compute-0 nova_compute[186266]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 10:47:31 compute-0 nova_compute[186266]: ++ cat /run_command
Feb 23 10:47:31 compute-0 nova_compute[186266]: + CMD=nova-compute
Feb 23 10:47:31 compute-0 nova_compute[186266]: + ARGS=
Feb 23 10:47:31 compute-0 nova_compute[186266]: + sudo kolla_copy_cacerts
Feb 23 10:47:31 compute-0 nova_compute[186266]: + [[ ! -n '' ]]
Feb 23 10:47:31 compute-0 nova_compute[186266]: + . kolla_extend_start
Feb 23 10:47:31 compute-0 nova_compute[186266]: Running command: 'nova-compute'
Feb 23 10:47:31 compute-0 nova_compute[186266]: + echo 'Running command: '\''nova-compute'\'''
Feb 23 10:47:31 compute-0 nova_compute[186266]: + umask 0022
Feb 23 10:47:31 compute-0 nova_compute[186266]: + exec nova-compute
Feb 23 10:47:33 compute-0 python3.9[186428]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.377 186270 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.378 186270 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.378 186270 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.378 186270 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.496 186270 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.503 186270 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:47:33 compute-0 nova_compute[186266]: 2026-02-23 10:47:33.504 186270 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 23 10:47:34 compute-0 sudo[186582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmjvwfamfnkienpptvtnnbhrncmrwtau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843654.0433357-2536-45098380364574/AnsiballZ_stat.py'
Feb 23 10:47:34 compute-0 sudo[186582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:34 compute-0 python3.9[186585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:47:34 compute-0 sudo[186582]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.573 186270 INFO nova.virt.driver [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.665 186270 INFO nova.compute.provider_config [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.679 186270 DEBUG oslo_concurrency.lockutils [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.679 186270 DEBUG oslo_concurrency.lockutils [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.679 186270 DEBUG oslo_concurrency.lockutils [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.680 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.681 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.682 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.683 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.683 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.683 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.683 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.683 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.683 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.684 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.685 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.686 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.687 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.687 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.687 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.687 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.687 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.687 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.688 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.689 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.690 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.691 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.692 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.693 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.694 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.695 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.696 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.697 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.698 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.699 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.700 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.701 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.701 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.701 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.701 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.701 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.701 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.702 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.703 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.704 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.705 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.705 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.705 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.705 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.705 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.705 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.706 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.707 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.708 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.709 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.710 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.710 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.710 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.710 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.710 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.710 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.711 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.712 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.713 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.714 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.715 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.716 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.717 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.718 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.719 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.720 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.721 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.722 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.723 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.724 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.725 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.725 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.725 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.725 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.725 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.725 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.726 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.727 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.728 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.729 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.730 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.730 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.730 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.730 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.730 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.730 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.731 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.732 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.733 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.734 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.735 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.736 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.737 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.738 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.739 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.740 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.741 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.742 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.743 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.744 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.745 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.746 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.746 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.746 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.746 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.746 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.746 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.747 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.748 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.749 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.749 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.749 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.749 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.749 186270 WARNING oslo_config.cfg [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 23 10:47:34 compute-0 nova_compute[186266]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 23 10:47:34 compute-0 nova_compute[186266]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 23 10:47:34 compute-0 nova_compute[186266]: and ``live_migration_inbound_addr`` respectively.
Feb 23 10:47:34 compute-0 nova_compute[186266]: ).  Its value may be silently ignored in the future.
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.749 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.750 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.751 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.752 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.753 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.754 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.755 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.756 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.757 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.758 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.759 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.760 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.761 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.761 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.761 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.761 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.761 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.761 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.762 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.763 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.764 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.765 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.765 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.765 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.765 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.765 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.765 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.766 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.767 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.768 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.769 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.770 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.771 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.772 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.773 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.774 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.775 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.776 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.777 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.778 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.779 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.780 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.781 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.782 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.782 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.782 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.782 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.782 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.782 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.783 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.784 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.784 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.784 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.784 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.784 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.784 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.785 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.786 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.787 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.788 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.789 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.790 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.791 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.792 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.793 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.794 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.795 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.796 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.797 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.797 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.797 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.797 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.797 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.798 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.798 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.798 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.798 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.798 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.798 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.799 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.800 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.801 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 sudo[186708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoittcxzbebbejlwxnzjzxwxkfvjwrfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843654.0433357-2536-45098380364574/AnsiballZ_copy.py'
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.802 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.803 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.804 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.805 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 sudo[186708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.806 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.807 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.808 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.809 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.809 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.809 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.809 186270 DEBUG oslo_service.service [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.810 186270 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.840 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.841 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.841 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.841 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 10:47:34 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 23 10:47:34 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.907 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f294faed2b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.910 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f294faed2b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.910 186270 INFO nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Connection event '1' reason 'None'
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.937 186270 WARNING nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 23 10:47:34 compute-0 nova_compute[186266]: 2026-02-23 10:47:34.938 186270 DEBUG nova.virt.libvirt.volume.mount [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 10:47:34 compute-0 python3.9[186711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843654.0433357-2536-45098380364574/.source.yaml _original_basename=.ck1_8qz6 follow=False checksum=d094858c9f7d0d9444251ac5c0df0fe25fe30ac2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:35 compute-0 sudo[186708]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.687 186270 INFO nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Libvirt host capabilities <capabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]: 
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <host>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <uuid>07d22930-8d23-4ccf-b924-bb75b3355502</uuid>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <arch>x86_64</arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model>EPYC-Rome-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <vendor>AMD</vendor>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <microcode version='16777317'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <signature family='23' model='49' stepping='0'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='x2apic'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='tsc-deadline'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='osxsave'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='hypervisor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='tsc_adjust'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='spec-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='stibp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='arch-capabilities'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='cmp_legacy'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='topoext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='virt-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='lbrv'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='tsc-scale'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='vmcb-clean'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='pause-filter'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='pfthreshold'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='svme-addr-chk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='rdctl-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='skip-l1dfl-vmentry'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='mds-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature name='pschange-mc-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <pages unit='KiB' size='4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <pages unit='KiB' size='2048'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <pages unit='KiB' size='1048576'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <power_management>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <suspend_mem/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <suspend_disk/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <suspend_hybrid/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </power_management>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <iommu support='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <migration_features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <live/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <uri_transports>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <uri_transport>tcp</uri_transport>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <uri_transport>rdma</uri_transport>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </uri_transports>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </migration_features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <topology>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <cells num='1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <cell id='0'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           <memory unit='KiB'>7864280</memory>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           <pages unit='KiB' size='4'>1966070</pages>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           <distances>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <sibling id='0' value='10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           </distances>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           <cpus num='8'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:           </cpus>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         </cell>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </cells>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </topology>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <cache>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </cache>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <secmodel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model>selinux</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <doi>0</doi>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </secmodel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <secmodel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model>dac</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <doi>0</doi>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </secmodel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </host>
Feb 23 10:47:35 compute-0 nova_compute[186266]: 
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <guest>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <os_type>hvm</os_type>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <arch name='i686'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <wordsize>32</wordsize>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <domain type='qemu'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <domain type='kvm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <pae/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <nonpae/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <acpi default='on' toggle='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <apic default='on' toggle='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <cpuselection/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <deviceboot/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <disksnapshot default='on' toggle='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <externalSnapshot/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </guest>
Feb 23 10:47:35 compute-0 nova_compute[186266]: 
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <guest>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <os_type>hvm</os_type>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <arch name='x86_64'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <wordsize>64</wordsize>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <domain type='qemu'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <domain type='kvm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <acpi default='on' toggle='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <apic default='on' toggle='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <cpuselection/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <deviceboot/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <disksnapshot default='on' toggle='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <externalSnapshot/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </guest>
Feb 23 10:47:35 compute-0 nova_compute[186266]: 
Feb 23 10:47:35 compute-0 nova_compute[186266]: </capabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]: 
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.693 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.709 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 23 10:47:35 compute-0 nova_compute[186266]: <domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <domain>kvm</domain>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <arch>i686</arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <vcpu max='4096'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <iothreads supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <os supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='firmware'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <loader supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>rom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pflash</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='readonly'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>yes</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='secure'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </loader>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </os>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='maximumMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <vendor>AMD</vendor>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='succor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='custom' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <memoryBacking supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='sourceType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>anonymous</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>memfd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </memoryBacking>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <disk supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='diskDevice'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>disk</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cdrom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>floppy</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>lun</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>fdc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>sata</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </disk>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <graphics supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vnc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egl-headless</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </graphics>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <video supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='modelType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vga</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cirrus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>none</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>bochs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ramfb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </video>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hostdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='mode'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>subsystem</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='startupPolicy'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>mandatory</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>requisite</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>optional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='subsysType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pci</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='capsType'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='pciBackend'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hostdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <rng supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>random</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </rng>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <filesystem supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='driverType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>path</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>handle</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtiofs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </filesystem>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tpm supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-tis</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-crb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emulator</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>external</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendVersion'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>2.0</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </tpm>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <redirdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </redirdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <channel supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </channel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <crypto supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </crypto>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <interface supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>passt</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </interface>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <panic supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>isa</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>hyperv</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </panic>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <console supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>null</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dev</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pipe</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stdio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>udp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tcp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu-vdagent</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </console>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <gic supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <genid supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backup supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <async-teardown supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <s390-pv supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <ps2 supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tdx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sev supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sgx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hyperv supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='features'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>relaxed</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vapic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>spinlocks</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vpindex</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>runtime</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>synic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stimer</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reset</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vendor_id</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>frequencies</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reenlightenment</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tlbflush</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ipi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>avic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emsr_bitmap</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>xmm_input</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hyperv>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <launchSecurity supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </features>
Feb 23 10:47:35 compute-0 nova_compute[186266]: </domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.716 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 10:47:35 compute-0 nova_compute[186266]: <domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <domain>kvm</domain>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <arch>i686</arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <vcpu max='240'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <iothreads supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <os supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='firmware'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <loader supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>rom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pflash</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='readonly'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>yes</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='secure'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </loader>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </os>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='maximumMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <vendor>AMD</vendor>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='succor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='custom' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <memoryBacking supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='sourceType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>anonymous</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>memfd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </memoryBacking>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <disk supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='diskDevice'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>disk</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cdrom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>floppy</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>lun</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ide</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>fdc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>sata</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </disk>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <graphics supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vnc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egl-headless</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </graphics>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <video supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='modelType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vga</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cirrus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>none</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>bochs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ramfb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </video>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hostdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='mode'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>subsystem</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='startupPolicy'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>mandatory</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>requisite</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>optional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='subsysType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pci</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='capsType'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='pciBackend'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hostdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <rng supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>random</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </rng>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <filesystem supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='driverType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>path</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>handle</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtiofs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </filesystem>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tpm supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-tis</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-crb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emulator</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>external</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendVersion'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>2.0</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </tpm>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <redirdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </redirdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <channel supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </channel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <crypto supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </crypto>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <interface supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>passt</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </interface>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <panic supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>isa</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>hyperv</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </panic>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <console supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>null</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dev</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pipe</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stdio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>udp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tcp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu-vdagent</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </console>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <gic supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <genid supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backup supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <async-teardown supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <s390-pv supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <ps2 supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tdx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sev supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sgx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hyperv supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='features'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>relaxed</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vapic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>spinlocks</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vpindex</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>runtime</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>synic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stimer</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reset</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vendor_id</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>frequencies</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reenlightenment</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tlbflush</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ipi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>avic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emsr_bitmap</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>xmm_input</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hyperv>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <launchSecurity supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </features>
Feb 23 10:47:35 compute-0 nova_compute[186266]: </domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.759 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.763 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 10:47:35 compute-0 nova_compute[186266]: <domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <domain>kvm</domain>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <arch>x86_64</arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <vcpu max='4096'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <iothreads supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <os supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='firmware'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>efi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <loader supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>rom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pflash</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='readonly'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>yes</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='secure'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>yes</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </loader>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </os>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='maximumMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <vendor>AMD</vendor>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='succor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='custom' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <memoryBacking supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='sourceType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>anonymous</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>memfd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </memoryBacking>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <disk supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='diskDevice'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>disk</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cdrom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>floppy</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>lun</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>fdc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>sata</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </disk>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <graphics supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vnc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egl-headless</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </graphics>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <video supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='modelType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vga</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cirrus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>none</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>bochs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ramfb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </video>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hostdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='mode'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>subsystem</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='startupPolicy'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>mandatory</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>requisite</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>optional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='subsysType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pci</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='capsType'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='pciBackend'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hostdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <rng supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>random</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </rng>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <filesystem supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='driverType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>path</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>handle</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtiofs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </filesystem>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tpm supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-tis</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-crb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emulator</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>external</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendVersion'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>2.0</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </tpm>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <redirdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </redirdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <channel supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </channel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <crypto supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </crypto>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <interface supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>passt</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </interface>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <panic supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>isa</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>hyperv</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </panic>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <console supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>null</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dev</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pipe</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stdio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>udp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tcp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu-vdagent</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </console>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <gic supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <genid supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backup supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <async-teardown supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <s390-pv supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <ps2 supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tdx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sev supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sgx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hyperv supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='features'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>relaxed</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vapic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>spinlocks</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vpindex</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>runtime</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>synic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stimer</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reset</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vendor_id</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>frequencies</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reenlightenment</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tlbflush</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ipi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>avic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emsr_bitmap</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>xmm_input</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hyperv>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <launchSecurity supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </features>
Feb 23 10:47:35 compute-0 nova_compute[186266]: </domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.828 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 10:47:35 compute-0 nova_compute[186266]: <domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <domain>kvm</domain>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <arch>x86_64</arch>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <vcpu max='240'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <iothreads supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <os supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='firmware'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <loader supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>rom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pflash</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='readonly'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>yes</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='secure'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>no</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </loader>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </os>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='maximumMigratable'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>on</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>off</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <vendor>AMD</vendor>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='succor'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <mode name='custom' supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ddpd-u'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sha512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm3'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sm4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Denverton-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amd-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='auto-ibrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='perfmon-v2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbpb'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='stibp-always-on'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='EPYC-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 python3.9[186925]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-128'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-256'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx10-512'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='prefetchiti'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Haswell-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512er'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512pf'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fma4'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tbm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xop'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='amx-tile'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-bf16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-fp16'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bitalg'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrc'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fzrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='la57'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='taa-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ifma'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cmpccxadd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fbsdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='fsrs'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ibrs-all'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='intel-psfd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='lam'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mcdt-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pbrsb-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='psdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='serialize'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vaes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='hle'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='rtm'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512bw'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512cd'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512dq'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512f'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='avx512vl'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='invpcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pcid'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='pku'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='mpx'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='core-capability'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='split-lock-detect'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='cldemote'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='erms'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='gfni'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdir64b'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='movdiri'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='xsaves'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='athlon-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='core2duo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='coreduo-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='n270-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='ss'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <blockers model='phenom-v1'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnow'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <feature name='3dnowext'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </blockers>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </mode>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <memoryBacking supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <enum name='sourceType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>anonymous</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <value>memfd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </memoryBacking>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <disk supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='diskDevice'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>disk</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cdrom</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>floppy</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>lun</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ide</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>fdc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>sata</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </disk>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <graphics supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vnc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egl-headless</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </graphics>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <video supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='modelType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vga</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>cirrus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>none</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>bochs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ramfb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </video>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hostdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='mode'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>subsystem</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='startupPolicy'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>mandatory</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>requisite</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>optional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='subsysType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pci</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>scsi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='capsType'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='pciBackend'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hostdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <rng supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtio-non-transitional</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>random</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>egd</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </rng>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <filesystem supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='driverType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>path</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>handle</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>virtiofs</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </filesystem>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tpm supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-tis</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tpm-crb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emulator</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>external</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendVersion'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>2.0</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </tpm>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <redirdev supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='bus'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>usb</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </redirdev>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <channel supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </channel>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <crypto supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendModel'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>builtin</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </crypto>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <interface supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='backendType'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>default</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>passt</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </interface>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <panic supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='model'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>isa</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>hyperv</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </panic>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <console supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='type'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>null</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vc</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pty</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dev</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>file</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>pipe</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stdio</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>udp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tcp</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>unix</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>qemu-vdagent</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>dbus</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </console>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </devices>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <features>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <gic supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <genid supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <backup supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <async-teardown supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <s390-pv supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <ps2 supported='yes'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <tdx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sev supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <sgx supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <hyperv supported='yes'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <enum name='features'>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>relaxed</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vapic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>spinlocks</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vpindex</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>runtime</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>synic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>stimer</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reset</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>vendor_id</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>frequencies</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>reenlightenment</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>tlbflush</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>ipi</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>avic</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>emsr_bitmap</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <value>xmm_input</value>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </enum>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       <defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:35 compute-0 nova_compute[186266]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:35 compute-0 nova_compute[186266]:       </defaults>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     </hyperv>
Feb 23 10:47:35 compute-0 nova_compute[186266]:     <launchSecurity supported='no'/>
Feb 23 10:47:35 compute-0 nova_compute[186266]:   </features>
Feb 23 10:47:35 compute-0 nova_compute[186266]: </domainCapabilities>
Feb 23 10:47:35 compute-0 nova_compute[186266]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.891 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.891 186270 INFO nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Secure Boot support detected
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.894 186270 INFO nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.894 186270 INFO nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.907 186270 DEBUG nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 23 10:47:35 compute-0 nova_compute[186266]:   <model>Nehalem</model>
Feb 23 10:47:35 compute-0 nova_compute[186266]: </cpu>
Feb 23 10:47:35 compute-0 nova_compute[186266]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.910 186270 DEBUG nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.950 186270 INFO nova.virt.node [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Determined node identity 8ecb3de0-8241-4d60-9a57-9609e064b906 from /var/lib/nova/compute_id
Feb 23 10:47:35 compute-0 nova_compute[186266]: 2026-02-23 10:47:35.975 186270 WARNING nova.compute.manager [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Compute nodes ['8ecb3de0-8241-4d60-9a57-9609e064b906'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.027 186270 INFO nova.compute.manager [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.115 186270 WARNING nova.compute.manager [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.115 186270 DEBUG oslo_concurrency.lockutils [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.115 186270 DEBUG oslo_concurrency.lockutils [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.116 186270 DEBUG oslo_concurrency.lockutils [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.116 186270 DEBUG nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:47:36 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 23 10:47:36 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.371 186270 WARNING nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.371 186270 DEBUG nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6177MB free_disk=73.43740463256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.372 186270 DEBUG oslo_concurrency.lockutils [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.372 186270 DEBUG oslo_concurrency.lockutils [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.437 186270 WARNING nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] No compute node record for compute-0.ctlplane.example.com:8ecb3de0-8241-4d60-9a57-9609e064b906: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8ecb3de0-8241-4d60-9a57-9609e064b906 could not be found.
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.489 186270 INFO nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 8ecb3de0-8241-4d60-9a57-9609e064b906
Feb 23 10:47:36 compute-0 python3.9[187098]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.610 186270 DEBUG nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:47:36 compute-0 nova_compute[186266]: 2026-02-23 10:47:36.610 186270 DEBUG nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:47:37 compute-0 python3.9[187248]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.770 186270 INFO nova.scheduler.client.report [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] [req-51fc8702-28cf-4a44-8d93-71b2c890b99c] Created resource provider record via placement API for resource provider with UUID 8ecb3de0-8241-4d60-9a57-9609e064b906 and name compute-0.ctlplane.example.com.
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.852 186270 DEBUG nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 23 10:47:37 compute-0 nova_compute[186266]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.852 186270 INFO nova.virt.libvirt.host [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] kernel doesn't support AMD SEV
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.852 186270 DEBUG nova.compute.provider_tree [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.853 186270 DEBUG nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.854 186270 DEBUG nova.virt.libvirt.driver [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Libvirt baseline CPU <cpu>
Feb 23 10:47:37 compute-0 nova_compute[186266]:   <arch>x86_64</arch>
Feb 23 10:47:37 compute-0 nova_compute[186266]:   <model>Nehalem</model>
Feb 23 10:47:37 compute-0 nova_compute[186266]:   <vendor>AMD</vendor>
Feb 23 10:47:37 compute-0 nova_compute[186266]:   <topology sockets="8" cores="1" threads="1"/>
Feb 23 10:47:37 compute-0 nova_compute[186266]: </cpu>
Feb 23 10:47:37 compute-0 nova_compute[186266]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.896 186270 DEBUG nova.scheduler.client.report [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Updated inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.896 186270 DEBUG nova.compute.provider_tree [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.897 186270 DEBUG nova.compute.provider_tree [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.960 186270 DEBUG nova.compute.provider_tree [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.986 186270 DEBUG nova.compute.resource_tracker [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.986 186270 DEBUG oslo_concurrency.lockutils [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:47:37 compute-0 nova_compute[186266]: 2026-02-23 10:47:37.986 186270 DEBUG nova.service [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 23 10:47:38 compute-0 nova_compute[186266]: 2026-02-23 10:47:38.059 186270 DEBUG nova.service [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 23 10:47:38 compute-0 nova_compute[186266]: 2026-02-23 10:47:38.059 186270 DEBUG nova.servicegroup.drivers.db [None req-fce28fa9-9c1d-4981-8e4d-2339a7d44198 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 23 10:47:38 compute-0 sudo[187398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxgdzsdltbpjxesmwfoxhodwbgyhkxgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843657.74768-2636-247726009331500/AnsiballZ_podman_container.py'
Feb 23 10:47:38 compute-0 sudo[187398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:38 compute-0 python3.9[187401]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 23 10:47:38 compute-0 sudo[187398]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:38 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:47:38 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:47:39 compute-0 sudo[187575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwmuowhzbxivgitcpxcczklkylcpvysr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843658.9673603-2652-122941429807484/AnsiballZ_systemd.py'
Feb 23 10:47:39 compute-0 sudo[187575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:39 compute-0 python3.9[187578]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 10:47:39 compute-0 systemd[1]: Stopping nova_compute container...
Feb 23 10:47:40 compute-0 nova_compute[186266]: 2026-02-23 10:47:40.143 186270 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 23 10:47:40 compute-0 nova_compute[186266]: 2026-02-23 10:47:40.147 186270 DEBUG oslo_concurrency.lockutils [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:47:40 compute-0 nova_compute[186266]: 2026-02-23 10:47:40.147 186270 DEBUG oslo_concurrency.lockutils [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:47:40 compute-0 nova_compute[186266]: 2026-02-23 10:47:40.148 186270 DEBUG oslo_concurrency.lockutils [None req-0e2baf18-f89b-49aa-8fc6-29e0b37be872 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:47:40 compute-0 systemd[1]: libpod-1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027.scope: Deactivated successfully.
Feb 23 10:47:40 compute-0 virtqemud[186733]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 23 10:47:40 compute-0 virtqemud[186733]: hostname: compute-0
Feb 23 10:47:40 compute-0 virtqemud[186733]: End of file while reading data: Input/output error
Feb 23 10:47:40 compute-0 systemd[1]: libpod-1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027.scope: Consumed 2.769s CPU time.
Feb 23 10:47:40 compute-0 podman[187582]: 2026-02-23 10:47:40.532896237 +0000 UTC m=+0.931763802 container died 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 10:47:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027-userdata-shm.mount: Deactivated successfully.
Feb 23 10:47:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b-merged.mount: Deactivated successfully.
Feb 23 10:47:40 compute-0 podman[187582]: 2026-02-23 10:47:40.586021875 +0000 UTC m=+0.984889480 container cleanup 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:47:40 compute-0 podman[187582]: nova_compute
Feb 23 10:47:40 compute-0 podman[187611]: nova_compute
Feb 23 10:47:40 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 23 10:47:40 compute-0 systemd[1]: Stopped nova_compute container.
Feb 23 10:47:40 compute-0 systemd[1]: Starting nova_compute container...
Feb 23 10:47:40 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:47:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d5e733bafaa4cc196fce0af5ac42f3d8d1cb2172caafa89c8ee50f1900fd91b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:40 compute-0 podman[187624]: 2026-02-23 10:47:40.772079473 +0000 UTC m=+0.090237145 container init 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:47:40 compute-0 podman[187624]: 2026-02-23 10:47:40.777736775 +0000 UTC m=+0.095894427 container start 1fc3bef0f720b8c223815206bbc85a6eaa5d52fc614f897980a8f627950cd027 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute)
Feb 23 10:47:40 compute-0 podman[187624]: nova_compute
Feb 23 10:47:40 compute-0 nova_compute[187639]: + sudo -E kolla_set_configs
Feb 23 10:47:40 compute-0 systemd[1]: Started nova_compute container.
Feb 23 10:47:40 compute-0 sudo[187575]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Validating config file
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying service configuration files
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /etc/ceph
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Creating directory /etc/ceph
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /etc/ceph
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Writing out command to execute
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:40 compute-0 nova_compute[187639]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 10:47:40 compute-0 nova_compute[187639]: ++ cat /run_command
Feb 23 10:47:40 compute-0 nova_compute[187639]: + CMD=nova-compute
Feb 23 10:47:40 compute-0 nova_compute[187639]: + ARGS=
Feb 23 10:47:40 compute-0 nova_compute[187639]: + sudo kolla_copy_cacerts
Feb 23 10:47:40 compute-0 nova_compute[187639]: + [[ ! -n '' ]]
Feb 23 10:47:40 compute-0 nova_compute[187639]: + . kolla_extend_start
Feb 23 10:47:40 compute-0 nova_compute[187639]: Running command: 'nova-compute'
Feb 23 10:47:40 compute-0 nova_compute[187639]: + echo 'Running command: '\''nova-compute'\'''
Feb 23 10:47:40 compute-0 nova_compute[187639]: + umask 0022
Feb 23 10:47:40 compute-0 nova_compute[187639]: + exec nova-compute
Feb 23 10:47:41 compute-0 sshd-session[187675]: Connection closed by authenticating user root 143.198.30.3 port 58754 [preauth]
Feb 23 10:47:41 compute-0 sudo[187802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgkbxbaywbnjzrbqganhcbgtsgasairu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843661.050479-2670-127676338384646/AnsiballZ_podman_container.py'
Feb 23 10:47:41 compute-0 sudo[187802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:41 compute-0 python3.9[187805]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 23 10:47:41 compute-0 systemd[1]: Started libpod-conmon-4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d.scope.
Feb 23 10:47:41 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:47:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d345ffb7d31743add1117dd24dc550b28b0ad17e00e9e29d2cb11812b249ade2/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d345ffb7d31743add1117dd24dc550b28b0ad17e00e9e29d2cb11812b249ade2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d345ffb7d31743add1117dd24dc550b28b0ad17e00e9e29d2cb11812b249ade2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 23 10:47:41 compute-0 podman[187831]: 2026-02-23 10:47:41.743867942 +0000 UTC m=+0.135457350 container init 4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, org.label-schema.build-date=20260216, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 23 10:47:41 compute-0 podman[187831]: 2026-02-23 10:47:41.750109979 +0000 UTC m=+0.141699347 container start 4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:47:41 compute-0 python3.9[187805]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Applying nova statedir ownership
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 23 10:47:41 compute-0 nova_compute_init[187852]: INFO:nova_statedir:Nova statedir ownership complete
Feb 23 10:47:41 compute-0 systemd[1]: libpod-4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d.scope: Deactivated successfully.
Feb 23 10:47:41 compute-0 podman[187878]: 2026-02-23 10:47:41.860615518 +0000 UTC m=+0.028445525 container died 4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, org.label-schema.license=GPLv2)
Feb 23 10:47:41 compute-0 sudo[187802]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d-userdata-shm.mount: Deactivated successfully.
Feb 23 10:47:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-d345ffb7d31743add1117dd24dc550b28b0ad17e00e9e29d2cb11812b249ade2-merged.mount: Deactivated successfully.
Feb 23 10:47:41 compute-0 podman[187878]: 2026-02-23 10:47:41.884569972 +0000 UTC m=+0.052399979 container cleanup 4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'bb74c4d67fea6c74cdea956a82630b3b2ebbe7131952570911ca7e973e2b5d0f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 23 10:47:41 compute-0 systemd[1]: libpod-conmon-4f84e856aeb81b65f92da1e975147bd77b6f098e99afb2c1043fdc71eb23778d.scope: Deactivated successfully.
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.540 187643 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.540 187643 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.540 187643 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.541 187643 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 23 10:47:42 compute-0 sshd-session[162631]: Connection closed by 192.168.122.30 port 33018
Feb 23 10:47:42 compute-0 sshd-session[162628]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:47:42 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 23 10:47:42 compute-0 systemd[1]: session-24.scope: Consumed 1min 24.436s CPU time.
Feb 23 10:47:42 compute-0 systemd-logind[808]: Session 24 logged out. Waiting for processes to exit.
Feb 23 10:47:42 compute-0 systemd-logind[808]: Removed session 24.
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.658 187643 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.682 187643 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:47:42 compute-0 nova_compute[187639]: 2026-02-23 10:47:42.682 187643 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.273 187643 INFO nova.virt.driver [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.385 187643 INFO nova.compute.provider_config [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.399 187643 DEBUG oslo_concurrency.lockutils [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.399 187643 DEBUG oslo_concurrency.lockutils [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.399 187643 DEBUG oslo_concurrency.lockutils [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.400 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.400 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.400 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.400 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.400 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.400 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.401 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.401 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.401 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.401 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.401 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.402 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.402 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.402 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.402 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.402 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.402 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.403 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.403 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.403 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.403 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.403 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.403 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.404 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.405 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.406 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.407 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.407 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.407 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.407 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.407 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.407 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.408 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.408 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.408 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.408 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.408 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.408 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.409 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.410 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.411 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.412 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.413 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.414 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.415 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.416 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.416 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.416 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.416 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.416 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.416 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.417 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.417 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.417 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.417 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.417 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.417 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.418 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.418 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.418 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.418 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.418 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.419 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.419 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.419 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.419 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.419 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.419 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.420 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.420 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.420 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.420 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.420 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.420 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.421 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.421 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.421 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.421 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.421 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.422 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.422 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.422 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.422 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.422 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.423 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.423 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.423 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.423 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.423 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.423 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.424 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.424 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.424 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.424 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.424 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.425 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.425 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.425 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.425 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.425 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.426 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.427 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.427 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.427 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.427 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.427 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.427 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.428 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.428 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.428 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.428 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.428 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.429 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.429 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.429 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.429 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.429 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.430 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.430 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.430 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.430 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.430 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.431 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.431 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.431 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.431 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.431 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.431 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.432 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.432 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.432 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.432 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.432 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.433 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.433 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.433 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.433 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.433 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.433 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.434 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.434 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.434 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.434 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.434 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.435 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.436 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.437 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.437 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.437 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.437 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.437 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.437 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.438 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.439 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.439 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.439 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.439 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.439 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.439 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.440 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.440 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.440 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.440 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.440 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.441 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.441 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.441 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.441 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.441 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.441 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.442 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.442 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.442 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.442 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.442 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.443 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.443 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.443 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.443 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.443 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.444 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.444 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.444 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.444 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.444 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.444 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.445 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.445 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.445 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.445 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.445 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.446 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.446 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.446 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.446 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.446 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.446 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.447 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.447 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.447 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.447 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.447 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.447 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.448 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.448 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.448 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.448 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.448 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.449 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.449 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.449 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.449 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.449 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.450 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.450 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.450 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.450 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.450 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.451 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.451 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.451 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.451 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.451 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.451 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.452 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.452 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.452 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.452 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.452 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.453 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.453 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.453 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.453 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.453 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.454 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.454 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.454 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.454 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.454 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.454 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.455 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.455 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.455 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.455 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.455 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.455 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.456 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.456 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.456 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.456 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.456 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.456 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.457 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.457 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.457 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.457 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.457 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.457 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.458 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.459 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.459 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.459 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.459 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.460 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.461 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.462 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.463 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.464 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.464 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.464 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.464 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.464 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.464 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.465 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.465 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.465 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.465 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.465 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.466 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.467 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.468 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.469 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.470 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.471 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.472 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.473 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.473 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.473 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.473 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.473 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.474 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.474 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.474 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.474 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.474 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.474 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.475 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.476 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.477 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.478 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.479 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.479 187643 WARNING oslo_config.cfg [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 23 10:47:43 compute-0 nova_compute[187639]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 23 10:47:43 compute-0 nova_compute[187639]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 23 10:47:43 compute-0 nova_compute[187639]: and ``live_migration_inbound_addr`` respectively.
Feb 23 10:47:43 compute-0 nova_compute[187639]: ).  Its value may be silently ignored in the future.
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.479 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.479 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.479 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.479 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.480 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.481 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.482 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.483 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.484 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.485 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.486 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.487 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.488 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.489 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.490 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.491 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.492 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.493 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.494 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.494 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.494 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.494 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.494 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.494 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.495 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.495 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.495 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.495 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.495 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.495 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.496 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.497 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.498 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.499 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.500 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.500 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.500 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.500 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.500 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.500 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.501 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.502 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.503 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.504 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.505 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.506 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.507 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.508 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.509 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.510 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.511 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.512 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.513 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.513 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.513 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.513 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.513 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.513 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.514 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.515 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.515 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.515 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.515 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.515 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.515 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.516 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.517 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.518 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.519 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.520 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.521 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.522 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.523 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.524 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.525 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.526 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.526 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.526 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.526 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.526 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.527 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.528 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.528 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.528 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.528 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.528 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.528 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.529 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.529 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.529 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.529 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.529 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.529 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.530 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.531 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.532 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.533 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.533 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.533 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.533 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.533 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.533 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.534 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.535 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.536 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.537 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.538 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.539 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.540 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.540 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.540 187643 DEBUG oslo_service.service [None req-6301bde7-f2d8-49af-b936-a74196daa9ce - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.541 187643 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.554 187643 INFO nova.virt.node [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Determined node identity 8ecb3de0-8241-4d60-9a57-9609e064b906 from /var/lib/nova/compute_id
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.554 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.555 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.555 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.555 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.565 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fec79409070> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.566 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fec79409070> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.567 187643 INFO nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Connection event '1' reason 'None'
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.571 187643 INFO nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Libvirt host capabilities <capabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]: 
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <host>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <uuid>07d22930-8d23-4ccf-b924-bb75b3355502</uuid>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <arch>x86_64</arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model>EPYC-Rome-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <vendor>AMD</vendor>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <microcode version='16777317'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <signature family='23' model='49' stepping='0'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='x2apic'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='tsc-deadline'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='osxsave'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='hypervisor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='tsc_adjust'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='spec-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='stibp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='arch-capabilities'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='cmp_legacy'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='topoext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='virt-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='lbrv'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='tsc-scale'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='vmcb-clean'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='pause-filter'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='pfthreshold'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='svme-addr-chk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='rdctl-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='skip-l1dfl-vmentry'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='mds-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature name='pschange-mc-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <pages unit='KiB' size='4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <pages unit='KiB' size='2048'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <pages unit='KiB' size='1048576'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <power_management>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <suspend_mem/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <suspend_disk/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <suspend_hybrid/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </power_management>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <iommu support='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <migration_features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <live/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <uri_transports>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <uri_transport>tcp</uri_transport>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <uri_transport>rdma</uri_transport>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </uri_transports>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </migration_features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <topology>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <cells num='1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <cell id='0'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           <memory unit='KiB'>7864280</memory>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           <pages unit='KiB' size='4'>1966070</pages>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           <distances>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <sibling id='0' value='10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           </distances>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           <cpus num='8'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:           </cpus>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         </cell>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </cells>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </topology>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <cache>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </cache>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <secmodel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model>selinux</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <doi>0</doi>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </secmodel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <secmodel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model>dac</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <doi>0</doi>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </secmodel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </host>
Feb 23 10:47:43 compute-0 nova_compute[187639]: 
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <guest>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <os_type>hvm</os_type>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <arch name='i686'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <wordsize>32</wordsize>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <domain type='qemu'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <domain type='kvm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <pae/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <nonpae/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <acpi default='on' toggle='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <apic default='on' toggle='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <cpuselection/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <deviceboot/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <disksnapshot default='on' toggle='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <externalSnapshot/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </guest>
Feb 23 10:47:43 compute-0 nova_compute[187639]: 
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <guest>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <os_type>hvm</os_type>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <arch name='x86_64'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <wordsize>64</wordsize>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <domain type='qemu'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <domain type='kvm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <acpi default='on' toggle='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <apic default='on' toggle='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <cpuselection/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <deviceboot/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <disksnapshot default='on' toggle='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <externalSnapshot/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </guest>
Feb 23 10:47:43 compute-0 nova_compute[187639]: 
Feb 23 10:47:43 compute-0 nova_compute[187639]: </capabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]: 
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.578 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.582 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 10:47:43 compute-0 nova_compute[187639]: <domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <domain>kvm</domain>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <arch>i686</arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <vcpu max='240'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <iothreads supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <os supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='firmware'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <loader supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>rom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pflash</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='readonly'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>yes</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='secure'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </loader>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </os>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='maximumMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <vendor>AMD</vendor>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='succor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='custom' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <memoryBacking supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='sourceType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>anonymous</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>memfd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </memoryBacking>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <disk supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='diskDevice'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>disk</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cdrom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>floppy</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>lun</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ide</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>fdc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>sata</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <graphics supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vnc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egl-headless</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </graphics>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <video supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='modelType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vga</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cirrus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>none</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>bochs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ramfb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </video>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hostdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='mode'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>subsystem</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='startupPolicy'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>mandatory</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>requisite</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>optional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='subsysType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pci</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='capsType'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='pciBackend'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hostdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <rng supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>random</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <filesystem supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='driverType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>path</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>handle</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtiofs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </filesystem>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tpm supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-tis</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-crb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emulator</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>external</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendVersion'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>2.0</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </tpm>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <redirdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </redirdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <channel supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </channel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <crypto supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </crypto>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <interface supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>passt</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <panic supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>isa</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>hyperv</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </panic>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <console supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>null</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dev</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pipe</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stdio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>udp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tcp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu-vdagent</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </console>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <gic supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <genid supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backup supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <async-teardown supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <s390-pv supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <ps2 supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tdx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sev supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sgx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hyperv supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='features'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>relaxed</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vapic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>spinlocks</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vpindex</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>runtime</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>synic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stimer</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reset</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vendor_id</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>frequencies</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reenlightenment</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tlbflush</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ipi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>avic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emsr_bitmap</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>xmm_input</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hyperv>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <launchSecurity supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </features>
Feb 23 10:47:43 compute-0 nova_compute[187639]: </domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.585 187643 DEBUG nova.virt.libvirt.volume.mount [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.587 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 23 10:47:43 compute-0 nova_compute[187639]: <domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <domain>kvm</domain>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <arch>i686</arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <vcpu max='4096'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <iothreads supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <os supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='firmware'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <loader supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>rom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pflash</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='readonly'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>yes</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='secure'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </loader>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </os>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='maximumMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <vendor>AMD</vendor>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='succor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='custom' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <memoryBacking supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='sourceType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>anonymous</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>memfd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </memoryBacking>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <disk supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='diskDevice'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>disk</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cdrom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>floppy</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>lun</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>fdc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>sata</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <graphics supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vnc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egl-headless</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </graphics>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <video supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='modelType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vga</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cirrus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>none</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>bochs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ramfb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </video>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hostdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='mode'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>subsystem</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='startupPolicy'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>mandatory</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>requisite</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>optional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='subsysType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pci</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='capsType'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='pciBackend'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hostdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <rng supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>random</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <filesystem supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='driverType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>path</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>handle</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtiofs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </filesystem>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tpm supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-tis</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-crb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emulator</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>external</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendVersion'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>2.0</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </tpm>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <redirdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </redirdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <channel supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </channel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <crypto supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </crypto>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <interface supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>passt</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <panic supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>isa</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>hyperv</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </panic>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <console supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>null</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dev</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pipe</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stdio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>udp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tcp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu-vdagent</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </console>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <gic supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <genid supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backup supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <async-teardown supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <s390-pv supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <ps2 supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tdx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sev supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sgx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hyperv supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='features'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>relaxed</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vapic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>spinlocks</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vpindex</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>runtime</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>synic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stimer</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reset</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vendor_id</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>frequencies</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reenlightenment</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tlbflush</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ipi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>avic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emsr_bitmap</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>xmm_input</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hyperv>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <launchSecurity supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </features>
Feb 23 10:47:43 compute-0 nova_compute[187639]: </domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.639 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.643 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 10:47:43 compute-0 nova_compute[187639]: <domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <domain>kvm</domain>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <arch>x86_64</arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <vcpu max='240'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <iothreads supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <os supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='firmware'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <loader supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>rom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pflash</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='readonly'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>yes</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='secure'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </loader>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </os>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='maximumMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <vendor>AMD</vendor>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='succor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='custom' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <memoryBacking supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='sourceType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>anonymous</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>memfd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </memoryBacking>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <disk supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='diskDevice'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>disk</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cdrom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>floppy</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>lun</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ide</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>fdc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>sata</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <graphics supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vnc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egl-headless</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </graphics>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <video supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='modelType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vga</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cirrus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>none</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>bochs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ramfb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </video>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hostdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='mode'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>subsystem</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='startupPolicy'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>mandatory</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>requisite</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>optional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='subsysType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pci</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='capsType'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='pciBackend'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hostdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <rng supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>random</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <filesystem supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='driverType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>path</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>handle</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtiofs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </filesystem>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tpm supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-tis</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-crb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emulator</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>external</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendVersion'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>2.0</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </tpm>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <redirdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </redirdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <channel supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </channel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <crypto supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </crypto>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <interface supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>passt</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <panic supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>isa</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>hyperv</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </panic>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <console supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>null</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dev</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pipe</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stdio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>udp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tcp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu-vdagent</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </console>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <gic supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <genid supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backup supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <async-teardown supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <s390-pv supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <ps2 supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tdx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sev supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sgx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hyperv supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='features'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>relaxed</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vapic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>spinlocks</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vpindex</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>runtime</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>synic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stimer</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reset</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vendor_id</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>frequencies</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reenlightenment</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tlbflush</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ipi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>avic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emsr_bitmap</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>xmm_input</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hyperv>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <launchSecurity supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </features>
Feb 23 10:47:43 compute-0 nova_compute[187639]: </domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.715 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 10:47:43 compute-0 nova_compute[187639]: <domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <domain>kvm</domain>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <arch>x86_64</arch>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <vcpu max='4096'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <iothreads supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <os supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='firmware'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>efi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <loader supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>rom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pflash</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='readonly'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>yes</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='secure'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>yes</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>no</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </loader>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </os>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-passthrough' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='hostPassthroughMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='maximum' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='maximumMigratable'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>on</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>off</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='host-model' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <vendor>AMD</vendor>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='x2apic'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='hypervisor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='stibp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='overflow-recov'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='succor'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lbrv'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='tsc-scale'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='flushbyasid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pause-filter'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='pfthreshold'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <feature policy='disable' name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <mode name='custom' supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Broadwell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='ClearwaterForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ddpd-u'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sha512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm3'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sm4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Cooperlake-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Denverton-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Dhyana-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Milan-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Rome-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-Turin-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amd-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='auto-ibrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vp2intersect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fs-gs-base-ns'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibpb-brtype'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='no-nested-data-bp'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='null-sel-clr-base'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='perfmon-v2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbpb'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='srso-user-kernel-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='stibp-always-on'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='EPYC-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='GraniteRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-128'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-256'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx10-512'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='prefetchiti'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Haswell-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v6'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Icelake-Server-v7'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='IvyBridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='KnightsMill-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4fmaps'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-4vnniw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512er'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512pf'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G4-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Opteron_G5-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fma4'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tbm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xop'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SapphireRapids-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='amx-tile'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-bf16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-fp16'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512-vpopcntdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bitalg'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vbmi2'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrc'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fzrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='la57'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='taa-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='tsx-ldtrk'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='SierraForest-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ifma'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-ne-convert'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx-vnni-int8'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bhi-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='bus-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cmpccxadd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fbsdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='fsrs'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ibrs-all'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='intel-psfd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ipred-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='lam'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mcdt-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pbrsb-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='psdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rrsba-ctrl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='sbdr-ssdp-no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='serialize'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vaes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='vpclmulqdq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Client-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='hle'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='rtm'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Skylake-Server-v5'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512bw'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512cd'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512dq'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512f'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='avx512vl'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='invpcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pcid'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='pku'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='mpx'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v2'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v3'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='core-capability'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='split-lock-detect'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='Snowridge-v4'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='cldemote'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='erms'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='gfni'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdir64b'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='movdiri'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='xsaves'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='athlon-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='core2duo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='coreduo-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='n270-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='ss'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <blockers model='phenom-v1'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnow'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <feature name='3dnowext'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </blockers>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </mode>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <memoryBacking supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <enum name='sourceType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>anonymous</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <value>memfd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </memoryBacking>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <disk supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='diskDevice'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>disk</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cdrom</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>floppy</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>lun</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>fdc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>sata</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <graphics supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vnc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egl-headless</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </graphics>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <video supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='modelType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vga</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>cirrus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>none</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>bochs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ramfb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </video>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hostdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='mode'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>subsystem</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='startupPolicy'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>mandatory</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>requisite</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>optional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='subsysType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pci</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>scsi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='capsType'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='pciBackend'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hostdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <rng supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtio-non-transitional</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>random</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>egd</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <filesystem supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='driverType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>path</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>handle</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>virtiofs</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </filesystem>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tpm supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-tis</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tpm-crb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emulator</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>external</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendVersion'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>2.0</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </tpm>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <redirdev supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='bus'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>usb</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </redirdev>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <channel supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </channel>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <crypto supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendModel'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>builtin</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </crypto>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <interface supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='backendType'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>default</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>passt</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <panic supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='model'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>isa</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>hyperv</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </panic>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <console supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='type'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>null</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vc</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pty</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dev</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>file</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>pipe</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stdio</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>udp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tcp</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>unix</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>qemu-vdagent</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>dbus</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </console>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <features>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <gic supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <vmcoreinfo supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <genid supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backingStoreInput supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <backup supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <async-teardown supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <s390-pv supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <ps2 supported='yes'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <tdx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sev supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <sgx supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <hyperv supported='yes'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <enum name='features'>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>relaxed</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vapic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>spinlocks</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vpindex</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>runtime</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>synic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>stimer</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reset</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>vendor_id</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>frequencies</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>reenlightenment</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>tlbflush</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>ipi</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>avic</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>emsr_bitmap</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <value>xmm_input</value>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </enum>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       <defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <spinlocks>4095</spinlocks>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <stimer_direct>on</stimer_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_direct>on</tlbflush_direct>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <tlbflush_extended>on</tlbflush_extended>
Feb 23 10:47:43 compute-0 nova_compute[187639]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 10:47:43 compute-0 nova_compute[187639]:       </defaults>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     </hyperv>
Feb 23 10:47:43 compute-0 nova_compute[187639]:     <launchSecurity supported='no'/>
Feb 23 10:47:43 compute-0 nova_compute[187639]:   </features>
Feb 23 10:47:43 compute-0 nova_compute[187639]: </domainCapabilities>
Feb 23 10:47:43 compute-0 nova_compute[187639]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.779 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.779 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.780 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.784 187643 INFO nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Secure Boot support detected
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.787 187643 INFO nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.787 187643 INFO nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.794 187643 DEBUG nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] cpu compare xml: <cpu match="exact">
Feb 23 10:47:43 compute-0 nova_compute[187639]:   <model>Nehalem</model>
Feb 23 10:47:43 compute-0 nova_compute[187639]: </cpu>
Feb 23 10:47:43 compute-0 nova_compute[187639]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.796 187643 DEBUG nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.895 187643 INFO nova.virt.node [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Determined node identity 8ecb3de0-8241-4d60-9a57-9609e064b906 from /var/lib/nova/compute_id
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.958 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Verified node 8ecb3de0-8241-4d60-9a57-9609e064b906 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 23 10:47:43 compute-0 nova_compute[187639]: 2026-02-23 10:47:43.995 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.169 187643 DEBUG oslo_concurrency.lockutils [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.170 187643 DEBUG oslo_concurrency.lockutils [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.170 187643 DEBUG oslo_concurrency.lockutils [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.171 187643 DEBUG nova.compute.resource_tracker [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:47:44 compute-0 rsyslogd[1017]: imjournal from <np0005626601:nova_compute>: begin to drop messages due to rate-limiting
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.319 187643 WARNING nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.320 187643 DEBUG nova.compute.resource_tracker [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6147MB free_disk=73.43547821044922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.320 187643 DEBUG oslo_concurrency.lockutils [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.321 187643 DEBUG oslo_concurrency.lockutils [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.456 187643 DEBUG nova.compute.resource_tracker [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.456 187643 DEBUG nova.compute.resource_tracker [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.474 187643 DEBUG nova.scheduler.client.report [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.530 187643 DEBUG nova.scheduler.client.report [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.530 187643 DEBUG nova.compute.provider_tree [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.546 187643 DEBUG nova.scheduler.client.report [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.574 187643 DEBUG nova.scheduler.client.report [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.591 187643 DEBUG nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 23 10:47:44 compute-0 nova_compute[187639]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.591 187643 INFO nova.virt.libvirt.host [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] kernel doesn't support AMD SEV
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.592 187643 DEBUG nova.compute.provider_tree [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.593 187643 DEBUG nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.596 187643 DEBUG nova.virt.libvirt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Libvirt baseline CPU <cpu>
Feb 23 10:47:44 compute-0 nova_compute[187639]:   <arch>x86_64</arch>
Feb 23 10:47:44 compute-0 nova_compute[187639]:   <model>Nehalem</model>
Feb 23 10:47:44 compute-0 nova_compute[187639]:   <vendor>AMD</vendor>
Feb 23 10:47:44 compute-0 nova_compute[187639]:   <topology sockets="8" cores="1" threads="1"/>
Feb 23 10:47:44 compute-0 nova_compute[187639]: </cpu>
Feb 23 10:47:44 compute-0 nova_compute[187639]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.623 187643 DEBUG nova.scheduler.client.report [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.688 187643 DEBUG nova.compute.resource_tracker [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.688 187643 DEBUG oslo_concurrency.lockutils [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.688 187643 DEBUG nova.service [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.725 187643 DEBUG nova.service [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 23 10:47:44 compute-0 nova_compute[187639]: 2026-02-23 10:47:44.726 187643 DEBUG nova.servicegroup.drivers.db [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 23 10:47:45 compute-0 podman[187942]: 2026-02-23 10:47:45.886956261 +0000 UTC m=+0.079563488 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 10:47:48 compute-0 sshd-session[187964]: Accepted publickey for zuul from 192.168.122.30 port 42752 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 10:47:48 compute-0 systemd-logind[808]: New session 26 of user zuul.
Feb 23 10:47:48 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 23 10:47:48 compute-0 sshd-session[187964]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 10:47:49 compute-0 python3.9[188117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 10:47:49 compute-0 podman[188169]: 2026-02-23 10:47:49.875505128 +0000 UTC m=+0.073439864 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:47:50 compute-0 sudo[188298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfcgpcezwkkfnnwdhtlfkakxtxjaylsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843669.776953-47-257462764180425/AnsiballZ_systemd_service.py'
Feb 23 10:47:50 compute-0 sudo[188298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:50 compute-0 python3.9[188301]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:47:50 compute-0 systemd[1]: Reloading.
Feb 23 10:47:50 compute-0 systemd-sysv-generator[188333]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:47:50 compute-0 systemd-rc-local-generator[188324]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:47:50 compute-0 nova_compute[187639]: 2026-02-23 10:47:50.728 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:47:50 compute-0 nova_compute[187639]: 2026-02-23 10:47:50.748 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:47:50 compute-0 sudo[188298]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:51 compute-0 python3.9[188493]: ansible-ansible.builtin.service_facts Invoked
Feb 23 10:47:51 compute-0 network[188510]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 10:47:51 compute-0 network[188511]: 'network-scripts' will be removed from distribution in near future.
Feb 23 10:47:51 compute-0 network[188512]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 10:47:54 compute-0 sudo[188783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdnhwbsuhtztfoigalafjteyzcusrlyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843674.606856-85-240607507479532/AnsiballZ_systemd_service.py'
Feb 23 10:47:54 compute-0 sudo[188783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:55 compute-0 python3.9[188786]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:47:55 compute-0 sudo[188783]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:55 compute-0 sudo[188937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhfxizqmpybbwvsflttkgpguqrvqdyxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843675.5249343-105-155125513355603/AnsiballZ_file.py'
Feb 23 10:47:55 compute-0 sudo[188937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:56 compute-0 python3.9[188940]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:56 compute-0 sudo[188937]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:56 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:47:56 compute-0 sudo[189091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nldgxglzyulcsbeaysezqybdpjplbspa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843676.3057542-121-1718437745656/AnsiballZ_file.py'
Feb 23 10:47:56 compute-0 sudo[189091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:56 compute-0 python3.9[189094]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:47:56 compute-0 sudo[189091]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:57 compute-0 sudo[189244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdeixcvqsdmacbjylidrgivwsjwliftt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843677.225071-139-29573069606100/AnsiballZ_command.py'
Feb 23 10:47:57 compute-0 sudo[189244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:57 compute-0 python3.9[189247]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:47:57 compute-0 sudo[189244]: pam_unix(sudo:session): session closed for user root
Feb 23 10:47:58 compute-0 python3.9[189399]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 10:47:59 compute-0 sudo[189549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtikphehqiljavpxhhezeodbqxakgtpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843678.9260876-175-72306955528801/AnsiballZ_systemd_service.py'
Feb 23 10:47:59 compute-0 sudo[189549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:47:59 compute-0 python3.9[189552]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:47:59 compute-0 systemd[1]: Reloading.
Feb 23 10:47:59 compute-0 systemd-rc-local-generator[189577]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:47:59 compute-0 systemd-sysv-generator[189582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:47:59 compute-0 sudo[189549]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:00 compute-0 sudo[189744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlhhlqvbghoqemdnkrabaxjfzfuaixnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843679.885664-191-144988533517004/AnsiballZ_command.py'
Feb 23 10:48:00 compute-0 sudo[189744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:00 compute-0 python3.9[189747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:48:00 compute-0 sudo[189744]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:00 compute-0 sudo[189898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phncihoxuavhggtviykegqtehkyzogvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843680.5180562-209-196505490229662/AnsiballZ_file.py'
Feb 23 10:48:00 compute-0 sudo[189898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:00 compute-0 python3.9[189901]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:01 compute-0 sudo[189898]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:01 compute-0 python3.9[190051]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:02 compute-0 sudo[190203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtncncumsrxttewwjnlemwwfqmntbnmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843682.0858605-241-184017327255003/AnsiballZ_group.py'
Feb 23 10:48:02 compute-0 sudo[190203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:02 compute-0 python3.9[190206]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 23 10:48:02 compute-0 sudo[190203]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:03 compute-0 sudo[190356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hepdmwghgbkqptqvjggerhezdtezzmht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843683.0802937-263-149628436754506/AnsiballZ_getent.py'
Feb 23 10:48:03 compute-0 sudo[190356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:03 compute-0 python3.9[190359]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 23 10:48:03 compute-0 sudo[190356]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:04 compute-0 sudo[190510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfrikonlvwbasdeuvpunbuyejvoqqhli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843683.9158206-279-14694968232389/AnsiballZ_group.py'
Feb 23 10:48:04 compute-0 sudo[190510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:04 compute-0 python3.9[190513]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 10:48:04 compute-0 groupadd[190514]: group added to /etc/group: name=ceilometer, GID=42405
Feb 23 10:48:04 compute-0 groupadd[190514]: group added to /etc/gshadow: name=ceilometer
Feb 23 10:48:04 compute-0 groupadd[190514]: new group: name=ceilometer, GID=42405
Feb 23 10:48:04 compute-0 sudo[190510]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:05 compute-0 sudo[190669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-albutvekerklbhjqnbzzgrxephfzjdix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843684.5999157-295-210171258621842/AnsiballZ_user.py'
Feb 23 10:48:05 compute-0 sudo[190669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:05 compute-0 python3.9[190672]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 10:48:05 compute-0 useradd[190674]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 23 10:48:05 compute-0 useradd[190674]: add 'ceilometer' to group 'libvirt'
Feb 23 10:48:05 compute-0 useradd[190674]: add 'ceilometer' to shadow group 'libvirt'
Feb 23 10:48:05 compute-0 sudo[190669]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:06 compute-0 python3.9[190830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:07 compute-0 python3.9[190951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771843686.2178998-347-12270523596206/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:07 compute-0 python3.9[191101]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:08 compute-0 python3.9[191222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771843687.4474003-347-234256232059610/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:08 compute-0 python3.9[191372]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:09 compute-0 python3.9[191493]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771843688.4756873-347-25294504959464/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:10 compute-0 python3.9[191643]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:10 compute-0 python3.9[191795]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:11 compute-0 python3.9[191947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:11 compute-0 sshd-session[191948]: Connection closed by authenticating user root 165.227.79.48 port 47020 [preauth]
Feb 23 10:48:12 compute-0 python3.9[192070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843691.1317031-465-250923811674493/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:48:12.627 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:48:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:48:12.628 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:48:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:48:12.629 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:48:12 compute-0 python3.9[192220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:13 compute-0 python3.9[192341]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843692.4425912-465-227352100599398/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:13 compute-0 sshd-session[192473]: Connection closed by authenticating user root 143.198.30.3 port 42496 [preauth]
Feb 23 10:48:14 compute-0 python3.9[192493]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:14 compute-0 python3.9[192614]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843693.6511443-523-217662028968879/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:15 compute-0 python3.9[192764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:15 compute-0 python3.9[192885]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843695.0723863-555-122771101725997/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:16 compute-0 podman[192886]: 2026-02-23 10:48:16.024491806 +0000 UTC m=+0.038309791 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 10:48:16 compute-0 python3.9[193054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:17 compute-0 python3.9[193175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843696.2164094-585-205026266017469/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:17 compute-0 python3.9[193325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:18 compute-0 python3.9[193446]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843697.5284357-615-100965668424103/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:18 compute-0 sudo[193596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgaywnnfxbholmufpimiwqqxkmnbofpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843698.627691-645-42703765508162/AnsiballZ_file.py'
Feb 23 10:48:18 compute-0 sudo[193596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:19 compute-0 python3.9[193599]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:19 compute-0 sudo[193596]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:19 compute-0 sudo[193749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myxgqwoystbgehneguwwahpcpfocpmgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843699.2453742-661-68843998128710/AnsiballZ_file.py'
Feb 23 10:48:19 compute-0 sudo[193749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:19 compute-0 python3.9[193752]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:19 compute-0 sudo[193749]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:20 compute-0 podman[193876]: 2026-02-23 10:48:20.180813699 +0000 UTC m=+0.106880072 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 10:48:20 compute-0 python3.9[193912]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:20 compute-0 python3.9[194077]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:21 compute-0 python3.9[194229]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:22 compute-0 sudo[194381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srewwlcgkumjdneuddpmxtxxcyaazyrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843701.891865-725-77959961976339/AnsiballZ_file.py'
Feb 23 10:48:22 compute-0 sudo[194381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:22 compute-0 python3.9[194384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:22 compute-0 sudo[194381]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:22 compute-0 sudo[194534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olxubdhcipsnoupdmmoimhkpdotzvedx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843702.5243502-741-264059256242517/AnsiballZ_systemd_service.py'
Feb 23 10:48:22 compute-0 sudo[194534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:23 compute-0 python3.9[194537]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:48:23 compute-0 systemd[1]: Reloading.
Feb 23 10:48:23 compute-0 systemd-sysv-generator[194570]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:48:23 compute-0 systemd-rc-local-generator[194566]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:48:23 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 23 10:48:23 compute-0 sudo[194534]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:24 compute-0 sudo[194732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axnjutihzlcfrpcixdzquygotqupjmkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843703.7979584-759-167750008491385/AnsiballZ_stat.py'
Feb 23 10:48:24 compute-0 sudo[194732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:24 compute-0 python3.9[194735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:24 compute-0 sudo[194732]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:24 compute-0 sudo[194856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzpebdozgrbwokcvrkauczgbtpmclcal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843703.7979584-759-167750008491385/AnsiballZ_copy.py'
Feb 23 10:48:24 compute-0 sudo[194856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:24 compute-0 python3.9[194859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843703.7979584-759-167750008491385/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:24 compute-0 sudo[194856]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:25 compute-0 sudo[195009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fejvvqkmzxfqwssnrfbxxlybsdbgrwvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843705.537826-801-190353804097462/AnsiballZ_file.py'
Feb 23 10:48:25 compute-0 sudo[195009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:25 compute-0 python3.9[195012]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:26 compute-0 sudo[195009]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:26 compute-0 sudo[195162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtgbeswuqasaliponfyrvobpokvvhdtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843706.2941551-817-38511528771810/AnsiballZ_file.py'
Feb 23 10:48:26 compute-0 sudo[195162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:26 compute-0 python3.9[195165]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:26 compute-0 sudo[195162]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:27 compute-0 python3.9[195315]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:29 compute-0 sudo[195736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbnpxyveluzfetukgmqarznzayffrhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843709.171856-885-98175287981244/AnsiballZ_container_config_data.py'
Feb 23 10:48:29 compute-0 sudo[195736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:29 compute-0 python3.9[195739]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 23 10:48:29 compute-0 sudo[195736]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:30 compute-0 sudo[195889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjcuccfagvemxqwaqizxzsyrcmiiflbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843710.3180223-907-210008113895953/AnsiballZ_container_config_hash.py'
Feb 23 10:48:30 compute-0 sudo[195889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:30 compute-0 python3.9[195892]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 10:48:30 compute-0 sudo[195889]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:31 compute-0 sudo[196042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npppfoliaxzxzsdojowfdulsrwmlezcc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843711.3626676-927-128066348997375/AnsiballZ_edpm_container_manage.py'
Feb 23 10:48:31 compute-0 sudo[196042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:32 compute-0 python3[196045]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 10:48:33 compute-0 podman[196057]: 2026-02-23 10:48:33.181064916 +0000 UTC m=+1.061199922 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 23 10:48:33 compute-0 podman[196151]: 2026-02-23 10:48:33.275098692 +0000 UTC m=+0.033909592 container create d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:48:33 compute-0 podman[196151]: 2026-02-23 10:48:33.257310044 +0000 UTC m=+0.016120974 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 23 10:48:33 compute-0 python3[196045]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 23 10:48:33 compute-0 sudo[196042]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:33 compute-0 sudo[196339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otltnovenwknhhozmbbmdnnusfacnpua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843713.716379-943-270103181619205/AnsiballZ_stat.py'
Feb 23 10:48:33 compute-0 sudo[196339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:34 compute-0 python3.9[196342]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:34 compute-0 sudo[196339]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:34 compute-0 sudo[196494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qngypybjwwdgzxvaasymojdaalmuttro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843714.5248222-961-28084073963476/AnsiballZ_file.py'
Feb 23 10:48:34 compute-0 sudo[196494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:34 compute-0 python3.9[196497]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:34 compute-0 sudo[196494]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:35 compute-0 sudo[196571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfzhiqjepvabmjmxaekehvmxffwtokwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843714.5248222-961-28084073963476/AnsiballZ_stat.py'
Feb 23 10:48:35 compute-0 sudo[196571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:35 compute-0 python3.9[196574]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:35 compute-0 sudo[196571]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:35 compute-0 sudo[196723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxmsbqshbiyisengldaxgfwtvncffgim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843715.4031858-961-272494421333561/AnsiballZ_copy.py'
Feb 23 10:48:35 compute-0 sudo[196723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:36 compute-0 python3.9[196726]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771843715.4031858-961-272494421333561/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:36 compute-0 sudo[196723]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:36 compute-0 sudo[196800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulttabfyseijzelqrvjiocrjyeyfyoti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843715.4031858-961-272494421333561/AnsiballZ_systemd.py'
Feb 23 10:48:36 compute-0 sudo[196800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:36 compute-0 python3.9[196803]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:48:36 compute-0 systemd[1]: Reloading.
Feb 23 10:48:36 compute-0 systemd-rc-local-generator[196831]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:48:36 compute-0 systemd-sysv-generator[196835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:48:37 compute-0 sudo[196800]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:37 compute-0 sudo[196920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioliwkufweptbaujggzgmksfxoatzcdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843715.4031858-961-272494421333561/AnsiballZ_systemd.py'
Feb 23 10:48:37 compute-0 sudo[196920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:37 compute-0 python3.9[196923]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:48:37 compute-0 systemd[1]: Reloading.
Feb 23 10:48:37 compute-0 systemd-rc-local-generator[196948]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:48:37 compute-0 systemd-sysv-generator[196953]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:48:38 compute-0 systemd[1]: Starting podman_exporter container...
Feb 23 10:48:38 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:48:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e8945de3ba720ac125cdc121913d94c9a5caf3e16c67a424f19074c9c7cf72/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 23 10:48:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e8945de3ba720ac125cdc121913d94c9a5caf3e16c67a424f19074c9c7cf72/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 23 10:48:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771.
Feb 23 10:48:38 compute-0 podman[196969]: 2026-02-23 10:48:38.137890675 +0000 UTC m=+0.109894033 container init d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.151Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.151Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.151Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.151Z caller=handler.go:105 level=info collector=container
Feb 23 10:48:38 compute-0 podman[196969]: 2026-02-23 10:48:38.168437606 +0000 UTC m=+0.140440974 container start d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:48:38 compute-0 podman[196969]: podman_exporter
Feb 23 10:48:38 compute-0 systemd[1]: Starting Podman API Service...
Feb 23 10:48:38 compute-0 systemd[1]: Started podman_exporter container.
Feb 23 10:48:38 compute-0 systemd[1]: Started Podman API Service.
Feb 23 10:48:38 compute-0 sudo[196920]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="Setting parallel job count to 25"
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="Using sqlite as database backend"
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 23 10:48:38 compute-0 podman[197002]: @ - - [23/Feb/2026:10:48:38 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 23 10:48:38 compute-0 podman[197002]: time="2026-02-23T10:48:38Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:48:38 compute-0 podman[196996]: 2026-02-23 10:48:38.231430068 +0000 UTC m=+0.056575091 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:48:38 compute-0 systemd[1]: d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771-69573ed19c1f46e4.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 10:48:38 compute-0 systemd[1]: d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771-69573ed19c1f46e4.service: Failed with result 'exit-code'.
Feb 23 10:48:38 compute-0 podman[197002]: @ - - [23/Feb/2026:10:48:38 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12585 "" "Go-http-client/1.1"
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.241Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.241Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 23 10:48:38 compute-0 podman_exporter[196985]: ts=2026-02-23T10:48:38.242Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 23 10:48:39 compute-0 python3.9[197183]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 10:48:40 compute-0 sudo[197333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teyihmovubwdofmtrtjzbixxzkzarlxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843719.8971207-1051-81786520707711/AnsiballZ_stat.py'
Feb 23 10:48:40 compute-0 sudo[197333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:40 compute-0 python3.9[197336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:40 compute-0 sudo[197333]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:40 compute-0 sudo[197459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzjrsdfeoccrvezhbtizhzazsibrjkge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843719.8971207-1051-81786520707711/AnsiballZ_copy.py'
Feb 23 10:48:40 compute-0 sudo[197459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:40 compute-0 python3.9[197462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843719.8971207-1051-81786520707711/.source.yaml _original_basename=.1idtfykh follow=False checksum=ff46fcb6f19e4f3df5e6f8f6c793f85d9b62f29d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:40 compute-0 sudo[197459]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:41 compute-0 sudo[197612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuhpjxkvwufdqqvwrvldbrrnpjcwcqrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843721.096933-1081-262625952760163/AnsiballZ_stat.py'
Feb 23 10:48:41 compute-0 sudo[197612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:41 compute-0 python3.9[197615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:41 compute-0 sudo[197612]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:41 compute-0 sudo[197736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqgmfvsrslolhnuytsnrevhkhwcpjehm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843721.096933-1081-262625952760163/AnsiballZ_copy.py'
Feb 23 10:48:41 compute-0 sudo[197736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:42 compute-0 python3.9[197739]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771843721.096933-1081-262625952760163/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:42 compute-0 sudo[197736]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.694 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.694 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.724 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.724 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.725 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.725 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.725 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.726 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.726 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.727 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.727 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.759 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.759 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.760 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.760 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.894 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.895 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6067MB free_disk=73.38365936279297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.895 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.895 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.974 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.974 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:48:42 compute-0 nova_compute[187639]: 2026-02-23 10:48:42.998 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:48:43 compute-0 nova_compute[187639]: 2026-02-23 10:48:43.019 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:48:43 compute-0 nova_compute[187639]: 2026-02-23 10:48:43.021 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:48:43 compute-0 nova_compute[187639]: 2026-02-23 10:48:43.021 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:48:43 compute-0 sudo[197889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnbkoxwivlcqmslklevnhbxfyviumeha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843722.9597952-1123-81937139953582/AnsiballZ_file.py'
Feb 23 10:48:43 compute-0 sudo[197889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:43 compute-0 python3.9[197892]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:43 compute-0 sudo[197889]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:43 compute-0 sudo[198042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlroyalckyveqqathhqpxpyytdqymgij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843723.6716833-1139-246320348581040/AnsiballZ_file.py'
Feb 23 10:48:43 compute-0 sudo[198042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:44 compute-0 python3.9[198045]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 10:48:44 compute-0 sudo[198042]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:44 compute-0 python3.9[198195]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:46 compute-0 sudo[198624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnmccrctnjzthebvqgxivvcffgivtmqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843726.3941414-1207-172336777724323/AnsiballZ_container_config_data.py'
Feb 23 10:48:46 compute-0 sudo[198624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:46 compute-0 podman[198590]: 2026-02-23 10:48:46.722924974 +0000 UTC m=+0.060549968 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 10:48:46 compute-0 python3.9[198636]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 23 10:48:46 compute-0 sudo[198624]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:47 compute-0 sshd-session[198640]: Connection closed by authenticating user root 143.198.30.3 port 35282 [preauth]
Feb 23 10:48:47 compute-0 sudo[198791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfrnwnvgfmmfqtzryhsjkfsllldnpfgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843727.3781734-1229-248881222285931/AnsiballZ_container_config_hash.py'
Feb 23 10:48:47 compute-0 sudo[198791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:47 compute-0 python3.9[198794]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 10:48:47 compute-0 sudo[198791]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:48 compute-0 sudo[198944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyivilporvhjfaanvbffclhtghdjdaw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843728.347848-1249-165768524249059/AnsiballZ_edpm_container_manage.py'
Feb 23 10:48:48 compute-0 sudo[198944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:48 compute-0 python3[198947]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 10:48:50 compute-0 podman[199005]: 2026-02-23 10:48:50.878443903 +0000 UTC m=+0.082965605 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 10:48:51 compute-0 podman[198960]: 2026-02-23 10:48:51.238511706 +0000 UTC m=+2.305290667 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 23 10:48:51 compute-0 podman[199085]: 2026-02-23 10:48:51.362138905 +0000 UTC m=+0.038746476 container create ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, managed_by=edpm_ansible, distribution-scope=public, vcs-type=git)
Feb 23 10:48:51 compute-0 podman[199085]: 2026-02-23 10:48:51.341441403 +0000 UTC m=+0.018048944 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 23 10:48:51 compute-0 python3[198947]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 23 10:48:51 compute-0 sudo[198944]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:52 compute-0 sudo[199273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvtpxjacqxkymazlpdpqqiaombmyfeik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843732.0250099-1265-150071821154686/AnsiballZ_stat.py'
Feb 23 10:48:52 compute-0 sudo[199273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:52 compute-0 python3.9[199276]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:52 compute-0 sudo[199273]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:53 compute-0 sudo[199428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zskhswstygnssoohrngkcciznemlpqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843732.8226993-1283-266598247044236/AnsiballZ_file.py'
Feb 23 10:48:53 compute-0 sudo[199428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:53 compute-0 python3.9[199431]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:53 compute-0 sudo[199428]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:53 compute-0 sudo[199505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pasphmzjkbyfxkiebqymbkpybesfnbkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843732.8226993-1283-266598247044236/AnsiballZ_stat.py'
Feb 23 10:48:53 compute-0 sudo[199505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:53 compute-0 python3.9[199508]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:48:53 compute-0 sudo[199505]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:54 compute-0 sudo[199657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awgxjqcvqaymxgcshiwysxaxqbccgzuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843733.6465535-1283-212912001243747/AnsiballZ_copy.py'
Feb 23 10:48:54 compute-0 sudo[199657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:54 compute-0 python3.9[199660]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771843733.6465535-1283-212912001243747/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:54 compute-0 sudo[199657]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:54 compute-0 sudo[199734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhjoywuheqtywovginxytnwrhudnnlpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843733.6465535-1283-212912001243747/AnsiballZ_systemd.py'
Feb 23 10:48:54 compute-0 sudo[199734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:54 compute-0 python3.9[199737]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 10:48:54 compute-0 systemd[1]: Reloading.
Feb 23 10:48:54 compute-0 systemd-sysv-generator[199764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:48:54 compute-0 systemd-rc-local-generator[199758]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:48:55 compute-0 sudo[199734]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:55 compute-0 sudo[199853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-essregkcnmrpbeluknaplekmjpeuzhql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843733.6465535-1283-212912001243747/AnsiballZ_systemd.py'
Feb 23 10:48:55 compute-0 sudo[199853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:55 compute-0 python3.9[199856]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 10:48:55 compute-0 systemd[1]: Reloading.
Feb 23 10:48:55 compute-0 systemd-rc-local-generator[199877]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 10:48:55 compute-0 systemd-sysv-generator[199884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 10:48:55 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 23 10:48:55 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:48:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dc1aa832fcb4489f6ee9fe584e4c3ec3cdef08db5769eaf6020f5d10bf5bbe3/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 23 10:48:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dc1aa832fcb4489f6ee9fe584e4c3ec3cdef08db5769eaf6020f5d10bf5bbe3/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 23 10:48:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dc1aa832fcb4489f6ee9fe584e4c3ec3cdef08db5769eaf6020f5d10bf5bbe3/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 23 10:48:55 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a.
Feb 23 10:48:55 compute-0 podman[199903]: 2026-02-23 10:48:55.965346862 +0000 UTC m=+0.101212923 container init ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64)
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *bridge.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *coverage.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *datapath.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *iface.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *memory.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *ovn.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *pmd_perf.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *pmd_rxq.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: INFO    10:48:55 main.go:48: registering *vswitch.Collector
Feb 23 10:48:55 compute-0 openstack_network_exporter[199919]: NOTICE  10:48:55 main.go:76: listening on https://:9105/metrics
Feb 23 10:48:55 compute-0 podman[199903]: 2026-02-23 10:48:55.988475568 +0000 UTC m=+0.124341609 container start ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 10:48:55 compute-0 podman[199903]: openstack_network_exporter
Feb 23 10:48:56 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 23 10:48:56 compute-0 sudo[199853]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:56 compute-0 podman[199930]: 2026-02-23 10:48:56.072431407 +0000 UTC m=+0.070991481 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 10:48:57 compute-0 python3.9[200102]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 10:48:57 compute-0 sshd-session[200127]: Connection closed by authenticating user root 165.227.79.48 port 57280 [preauth]
Feb 23 10:48:58 compute-0 sudo[200254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwcghypvmuizgszqgbgfdlexhcaeskdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843737.8147047-1373-126470038153189/AnsiballZ_stat.py'
Feb 23 10:48:58 compute-0 sudo[200254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:58 compute-0 python3.9[200257]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:48:58 compute-0 sudo[200254]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:58 compute-0 sudo[200380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klokkzjhxfkqrmjbgfwjhiqofzuhhszg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843737.8147047-1373-126470038153189/AnsiballZ_copy.py'
Feb 23 10:48:58 compute-0 sudo[200380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:58 compute-0 python3.9[200383]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843737.8147047-1373-126470038153189/.source.yaml _original_basename=.h33jejlu follow=False checksum=c170f2f66e217565b1a8aa7f33ce5df6a61773a7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:48:58 compute-0 sudo[200380]: pam_unix(sudo:session): session closed for user root
Feb 23 10:48:59 compute-0 sudo[200533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oulubmtesdovuqrljgmcqnulzufqbktn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843739.1001847-1403-214202621844998/AnsiballZ_find.py'
Feb 23 10:48:59 compute-0 sudo[200533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:48:59 compute-0 python3.9[200536]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 10:48:59 compute-0 sudo[200533]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:00 compute-0 sudo[200686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iukueepqlwbreczqhpjlgmxlsebwuadb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843740.0767593-1422-132871809575782/AnsiballZ_podman_container_info.py'
Feb 23 10:49:00 compute-0 sudo[200686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:00 compute-0 python3.9[200689]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 23 10:49:00 compute-0 sudo[200686]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:01 compute-0 sudo[200852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivhrycakexlkedeldhbdkflslqomkhoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843740.942654-1430-113047005707086/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:01 compute-0 sudo[200852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:01 compute-0 python3.9[200855]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:01 compute-0 systemd[1]: Started libpod-conmon-13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1.scope.
Feb 23 10:49:01 compute-0 podman[200856]: 2026-02-23 10:49:01.666895175 +0000 UTC m=+0.098247155 container exec 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Feb 23 10:49:01 compute-0 podman[200856]: 2026-02-23 10:49:01.702421885 +0000 UTC m=+0.133773855 container exec_died 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 23 10:49:01 compute-0 systemd[1]: libpod-conmon-13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1.scope: Deactivated successfully.
Feb 23 10:49:01 compute-0 sudo[200852]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:02 compute-0 sudo[201035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlvykewgkktbkammbjrrihbitfgtgjxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843741.9281628-1438-146812155853795/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:02 compute-0 sudo[201035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:02 compute-0 python3.9[201038]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:02 compute-0 systemd[1]: Started libpod-conmon-13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1.scope.
Feb 23 10:49:02 compute-0 podman[201039]: 2026-02-23 10:49:02.525472328 +0000 UTC m=+0.071490444 container exec 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:49:02 compute-0 podman[201039]: 2026-02-23 10:49:02.556462279 +0000 UTC m=+0.102480405 container exec_died 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:49:02 compute-0 systemd[1]: libpod-conmon-13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1.scope: Deactivated successfully.
Feb 23 10:49:02 compute-0 sudo[201035]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:03 compute-0 sudo[201222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvdvrzullcapmwuxayyjkidktxztkvgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843742.763953-1446-103082291888093/AnsiballZ_file.py'
Feb 23 10:49:03 compute-0 sudo[201222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:03 compute-0 python3.9[201225]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:03 compute-0 sudo[201222]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:03 compute-0 sudo[201375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drkxkomsfwcrinlsqoomzadgzclnzjtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843743.4706023-1455-176829256099816/AnsiballZ_podman_container_info.py'
Feb 23 10:49:03 compute-0 sudo[201375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:03 compute-0 python3.9[201378]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 23 10:49:03 compute-0 sudo[201375]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:04 compute-0 sudo[201541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlklufmzploihmovgpcyqxuykcqaigbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843744.2228808-1463-117969904578393/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:04 compute-0 sudo[201541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:04 compute-0 python3.9[201544]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:04 compute-0 systemd[1]: Started libpod-conmon-7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7.scope.
Feb 23 10:49:04 compute-0 podman[201545]: 2026-02-23 10:49:04.737573702 +0000 UTC m=+0.088635113 container exec 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:49:04 compute-0 podman[201545]: 2026-02-23 10:49:04.773180995 +0000 UTC m=+0.124242446 container exec_died 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:49:04 compute-0 systemd[1]: libpod-conmon-7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7.scope: Deactivated successfully.
Feb 23 10:49:04 compute-0 sudo[201541]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:05 compute-0 sudo[201727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iailskyzekblbermltjgggbmaegkebrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843744.9644206-1471-21442563076843/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:05 compute-0 sudo[201727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:05 compute-0 python3.9[201730]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:05 compute-0 systemd[1]: Started libpod-conmon-7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7.scope.
Feb 23 10:49:05 compute-0 podman[201731]: 2026-02-23 10:49:05.438869125 +0000 UTC m=+0.079118583 container exec 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:49:05 compute-0 podman[201751]: 2026-02-23 10:49:05.494796241 +0000 UTC m=+0.047386883 container exec_died 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 10:49:05 compute-0 podman[201731]: 2026-02-23 10:49:05.500262144 +0000 UTC m=+0.140511612 container exec_died 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:49:05 compute-0 systemd[1]: libpod-conmon-7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7.scope: Deactivated successfully.
Feb 23 10:49:05 compute-0 sudo[201727]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:05 compute-0 sudo[201913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvqksmbukhnxptbyrrvyclsqigsfuozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843745.6904824-1479-277016943199841/AnsiballZ_file.py'
Feb 23 10:49:05 compute-0 sudo[201913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:06 compute-0 python3.9[201916]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:06 compute-0 sudo[201913]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:06 compute-0 sudo[202066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsgbbrdamxycognwrnrfdxvbrncastpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843746.3748243-1488-69655780876347/AnsiballZ_podman_container_info.py'
Feb 23 10:49:06 compute-0 sudo[202066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:06 compute-0 python3.9[202069]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 23 10:49:06 compute-0 sudo[202066]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:07 compute-0 sudo[202232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-barabucmunaimctdxoiygzqlemteepun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843747.0303733-1496-10042633740995/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:07 compute-0 sudo[202232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:07 compute-0 python3.9[202235]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:07 compute-0 systemd[1]: Started libpod-conmon-d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771.scope.
Feb 23 10:49:07 compute-0 podman[202236]: 2026-02-23 10:49:07.609758769 +0000 UTC m=+0.071597586 container exec d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:49:07 compute-0 podman[202236]: 2026-02-23 10:49:07.638814721 +0000 UTC m=+0.100653518 container exec_died d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:49:07 compute-0 systemd[1]: libpod-conmon-d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771.scope: Deactivated successfully.
Feb 23 10:49:07 compute-0 sudo[202232]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:08 compute-0 sudo[202415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahxkpuveojbnosjhxooawscnpuvwjhme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843747.824172-1504-6464621902933/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:08 compute-0 sudo[202415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:08 compute-0 python3.9[202418]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:08 compute-0 systemd[1]: Started libpod-conmon-d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771.scope.
Feb 23 10:49:08 compute-0 podman[202419]: 2026-02-23 10:49:08.314362029 +0000 UTC m=+0.077025509 container exec d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:49:08 compute-0 podman[202419]: 2026-02-23 10:49:08.3621061 +0000 UTC m=+0.124769610 container exec_died d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:49:08 compute-0 systemd[1]: libpod-conmon-d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771.scope: Deactivated successfully.
Feb 23 10:49:08 compute-0 podman[202437]: 2026-02-23 10:49:08.373268062 +0000 UTC m=+0.054614811 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:49:08 compute-0 sudo[202415]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:08 compute-0 sudo[202625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtkivduzqdxaqplucebicbleyqsjwrln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843748.5517986-1512-121816210594005/AnsiballZ_file.py'
Feb 23 10:49:08 compute-0 sudo[202625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:09 compute-0 python3.9[202628]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:09 compute-0 sudo[202625]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:09 compute-0 sudo[202778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixfygoniwzothcceapqbofznbwnwclam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843749.2046735-1521-218351006092017/AnsiballZ_podman_container_info.py'
Feb 23 10:49:09 compute-0 sudo[202778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:09 compute-0 python3.9[202781]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 23 10:49:09 compute-0 sudo[202778]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:10 compute-0 sudo[202944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vajwjssnbwomnoxiibgiqaxubhnbysoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843749.9976292-1529-162640624369157/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:10 compute-0 sudo[202944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:10 compute-0 python3.9[202947]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:10 compute-0 systemd[1]: Started libpod-conmon-ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a.scope.
Feb 23 10:49:10 compute-0 podman[202948]: 2026-02-23 10:49:10.604702173 +0000 UTC m=+0.086217160 container exec ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal)
Feb 23 10:49:10 compute-0 podman[202948]: 2026-02-23 10:49:10.635879599 +0000 UTC m=+0.117394616 container exec_died ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, version=9.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter)
Feb 23 10:49:10 compute-0 systemd[1]: libpod-conmon-ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a.scope: Deactivated successfully.
Feb 23 10:49:10 compute-0 sudo[202944]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:11 compute-0 sudo[203130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwzvwfomyfnjffedwhzlevfjwsovsmeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843750.8539445-1537-5378776578164/AnsiballZ_podman_container_exec.py'
Feb 23 10:49:11 compute-0 sudo[203130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:11 compute-0 python3.9[203133]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 10:49:11 compute-0 systemd[1]: Started libpod-conmon-ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a.scope.
Feb 23 10:49:11 compute-0 podman[203134]: 2026-02-23 10:49:11.408246264 +0000 UTC m=+0.077152012 container exec ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, architecture=x86_64)
Feb 23 10:49:11 compute-0 podman[203134]: 2026-02-23 10:49:11.437596503 +0000 UTC m=+0.106502251 container exec_died ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, distribution-scope=public, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 23 10:49:11 compute-0 systemd[1]: libpod-conmon-ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a.scope: Deactivated successfully.
Feb 23 10:49:11 compute-0 sudo[203130]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:11 compute-0 sudo[203316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqxdfbzafdziwzkwphoxylbmaeonpcps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843751.6542382-1545-75275258803823/AnsiballZ_file.py'
Feb 23 10:49:11 compute-0 sudo[203316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:12 compute-0 python3.9[203319]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:12 compute-0 sudo[203316]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:49:12.628 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:49:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:49:12.630 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:49:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:49:12.630 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:49:16 compute-0 podman[203344]: 2026-02-23 10:49:16.85232022 +0000 UTC m=+0.056719457 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 10:49:21 compute-0 sshd-session[203364]: Connection closed by authenticating user root 143.198.30.3 port 41950 [preauth]
Feb 23 10:49:21 compute-0 podman[203366]: 2026-02-23 10:49:21.896905541 +0000 UTC m=+0.094187789 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible)
Feb 23 10:49:26 compute-0 sudo[203532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvuqmavnspatvygelddjylezqxzajfoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843766.0743604-1687-181162393582148/AnsiballZ_file.py'
Feb 23 10:49:26 compute-0 podman[203491]: 2026-02-23 10:49:26.342491919 +0000 UTC m=+0.056617914 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347)
Feb 23 10:49:26 compute-0 sudo[203532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:26 compute-0 python3.9[203540]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:26 compute-0 sudo[203532]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:27 compute-0 sudo[203690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tabzmhrldgunqogikvamxflhlyngbjjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843766.7503371-1703-167837104470907/AnsiballZ_stat.py'
Feb 23 10:49:27 compute-0 sudo[203690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:27 compute-0 python3.9[203693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:27 compute-0 sudo[203690]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:27 compute-0 sudo[203814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odhfedvirmsixwhabsnwvqpauegszcjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843766.7503371-1703-167837104470907/AnsiballZ_copy.py'
Feb 23 10:49:27 compute-0 sudo[203814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:27 compute-0 python3.9[203817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771843766.7503371-1703-167837104470907/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:27 compute-0 sudo[203814]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:28 compute-0 sudo[203967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzbcreqkuyblrvdpqpyuvqhkeucjtymz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843768.1340058-1735-70882934924035/AnsiballZ_file.py'
Feb 23 10:49:28 compute-0 sudo[203967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:28 compute-0 python3.9[203970]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:28 compute-0 sudo[203967]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:29 compute-0 sudo[204120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-issjnmqervxhicjuqdtjxrdqzpmwhuly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843768.7567856-1751-124857107940682/AnsiballZ_stat.py'
Feb 23 10:49:29 compute-0 sudo[204120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:29 compute-0 python3.9[204123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:29 compute-0 sudo[204120]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:29 compute-0 sudo[204199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umutllpdfuluupjzabrshfrdvkaukrev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843768.7567856-1751-124857107940682/AnsiballZ_file.py'
Feb 23 10:49:29 compute-0 sudo[204199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:29 compute-0 python3.9[204202]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:29 compute-0 sudo[204199]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:30 compute-0 sudo[204352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnvtegwxhbhmhopatfnqnfaanblwslmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843769.9630916-1775-85484380123311/AnsiballZ_stat.py'
Feb 23 10:49:30 compute-0 sudo[204352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:30 compute-0 python3.9[204355]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:30 compute-0 sudo[204352]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:30 compute-0 sudo[204431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dopftsbpeycluxcfmmyxeyjxefaozplz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843769.9630916-1775-85484380123311/AnsiballZ_file.py'
Feb 23 10:49:30 compute-0 sudo[204431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:30 compute-0 python3.9[204434]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.s7qxvpbv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:30 compute-0 sudo[204431]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:31 compute-0 sudo[204584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saegralzdubnxeznjgqmjxspdadtqfrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843771.0186548-1799-93923634359036/AnsiballZ_stat.py'
Feb 23 10:49:31 compute-0 sudo[204584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:31 compute-0 python3.9[204587]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:31 compute-0 sudo[204584]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:31 compute-0 sudo[204663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqqeyoiemxrnzzvcgtjocfejrockarlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843771.0186548-1799-93923634359036/AnsiballZ_file.py'
Feb 23 10:49:31 compute-0 sudo[204663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:31 compute-0 python3.9[204666]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:31 compute-0 sudo[204663]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:32 compute-0 sudo[204816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkclhbizqnpkvtflttpsargmlcutwhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843772.1729105-1825-109077219247385/AnsiballZ_command.py'
Feb 23 10:49:32 compute-0 sudo[204816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:32 compute-0 python3.9[204819]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:49:32 compute-0 sudo[204816]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:33 compute-0 sudo[204970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufedkptkzhdlfutoynrmypboxgbmrlvv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771843772.9644196-1841-212613462314180/AnsiballZ_edpm_nftables_from_files.py'
Feb 23 10:49:33 compute-0 sudo[204970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:33 compute-0 python3[204973]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 10:49:33 compute-0 sudo[204970]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:34 compute-0 sudo[205123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eitknlhftketldcxhzicjswjotoatuvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843773.8161204-1857-248516795215482/AnsiballZ_stat.py'
Feb 23 10:49:34 compute-0 sudo[205123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:34 compute-0 python3.9[205126]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:34 compute-0 sudo[205123]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:34 compute-0 sudo[205202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgznxvvblpantludbeylntccasiupism ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843773.8161204-1857-248516795215482/AnsiballZ_file.py'
Feb 23 10:49:34 compute-0 sudo[205202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:34 compute-0 python3.9[205205]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:34 compute-0 sudo[205202]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:35 compute-0 sudo[205355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmbbtuiogucvtnxbjipyomwfjfhclehr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843775.0408752-1881-7731755331812/AnsiballZ_stat.py'
Feb 23 10:49:35 compute-0 sudo[205355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:35 compute-0 python3.9[205358]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:35 compute-0 sudo[205355]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:35 compute-0 sudo[205434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agdzsuqrlleteoyqysksnqrggobzhpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843775.0408752-1881-7731755331812/AnsiballZ_file.py'
Feb 23 10:49:35 compute-0 sudo[205434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:35 compute-0 python3.9[205437]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:35 compute-0 sudo[205434]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:36 compute-0 sudo[205587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scgdtzrfjwxbizopodrzbjhapaoggzvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843776.2866073-1905-18309164134417/AnsiballZ_stat.py'
Feb 23 10:49:36 compute-0 sudo[205587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:36 compute-0 python3.9[205590]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:36 compute-0 sudo[205587]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:36 compute-0 sudo[205666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsxekzurkkyykzvbaxmxlpcpjhywneyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843776.2866073-1905-18309164134417/AnsiballZ_file.py'
Feb 23 10:49:36 compute-0 sudo[205666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:37 compute-0 python3.9[205669]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:37 compute-0 sudo[205666]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:37 compute-0 sudo[205819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgqlcwxyphixesudsnurzwktwehehkyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843777.559642-1929-125183934553842/AnsiballZ_stat.py'
Feb 23 10:49:37 compute-0 sudo[205819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:38 compute-0 python3.9[205822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:38 compute-0 sudo[205819]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:38 compute-0 sudo[205898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tccdnjwjcujutixniqgiaizuflmxbldi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843777.559642-1929-125183934553842/AnsiballZ_file.py'
Feb 23 10:49:38 compute-0 sudo[205898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:38 compute-0 python3.9[205901]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:38 compute-0 sudo[205898]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:38 compute-0 podman[205926]: 2026-02-23 10:49:38.851301541 +0000 UTC m=+0.056080540 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:49:39 compute-0 sudo[206075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbaypockgsuvgtjyfogwfvzkhoktxrxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843778.8689358-1953-214932082368189/AnsiballZ_stat.py'
Feb 23 10:49:39 compute-0 sudo[206075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:39 compute-0 python3.9[206078]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 10:49:39 compute-0 sudo[206075]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:39 compute-0 sudo[206201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taquezeuhggetbnbdzggxghhaiogupbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843778.8689358-1953-214932082368189/AnsiballZ_copy.py'
Feb 23 10:49:39 compute-0 sudo[206201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:39 compute-0 python3.9[206204]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771843778.8689358-1953-214932082368189/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:39 compute-0 sudo[206201]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:40 compute-0 sudo[206354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyttpqonnwhqrhmmmjaavthkamswlsea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843780.331662-1983-249117376725659/AnsiballZ_file.py'
Feb 23 10:49:40 compute-0 sudo[206354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:40 compute-0 python3.9[206357]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:40 compute-0 sudo[206354]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:41 compute-0 sudo[206507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyxsfigqmddfhqmjmpunmnhplbngfkmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843780.9859262-1999-16329906990724/AnsiballZ_command.py'
Feb 23 10:49:41 compute-0 sudo[206507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:41 compute-0 python3.9[206510]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:49:41 compute-0 sudo[206507]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:42 compute-0 sudo[206663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shmvcuowzhyeuarklxrbegswerkaosld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843781.707249-2015-102692547670381/AnsiballZ_blockinfile.py'
Feb 23 10:49:42 compute-0 sudo[206663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:42 compute-0 python3.9[206666]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:42 compute-0 sudo[206663]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:42 compute-0 sudo[206816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pihkfpqtqjnbovudkyguyoxgyhakradn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843782.5596092-2033-100919798294899/AnsiballZ_command.py'
Feb 23 10:49:42 compute-0 sudo[206816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:42 compute-0 python3.9[206819]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:49:43 compute-0 sudo[206816]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.026 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.027 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.052 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.053 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.083 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.083 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.084 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.084 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.188 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.189 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5947MB free_disk=73.24344253540039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.189 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.189 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.280 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.280 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.308 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.322 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.323 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.323 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:49:43 compute-0 sudo[206970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-girepcpruqzfkgfjpjcdfqybalydcczd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843783.2845926-2049-268667949320190/AnsiballZ_stat.py'
Feb 23 10:49:43 compute-0 sudo[206970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:43 compute-0 python3.9[206973]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 10:49:43 compute-0 sudo[206970]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:43 compute-0 sshd-session[207000]: Connection closed by authenticating user root 165.227.79.48 port 56800 [preauth]
Feb 23 10:49:43 compute-0 nova_compute[187639]: 2026-02-23 10:49:43.962 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:44 compute-0 sudo[207127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gugshoqypluyhbrciyrfurztbffhmchj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843784.016596-2065-166222494729453/AnsiballZ_command.py'
Feb 23 10:49:44 compute-0 sudo[207127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:44 compute-0 python3.9[207130]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:49:44 compute-0 sudo[207127]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.711 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.711 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:49:44 compute-0 nova_compute[187639]: 2026-02-23 10:49:44.712 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:49:44 compute-0 sudo[207283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfpmrzdmgoqauluncfeoeebahthekutn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771843784.6822088-2081-198338352447264/AnsiballZ_file.py'
Feb 23 10:49:44 compute-0 sudo[207283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 10:49:45 compute-0 python3.9[207286]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 10:49:45 compute-0 sudo[207283]: pam_unix(sudo:session): session closed for user root
Feb 23 10:49:45 compute-0 sshd-session[187967]: Connection closed by 192.168.122.30 port 42752
Feb 23 10:49:45 compute-0 sshd-session[187964]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:49:45 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Feb 23 10:49:45 compute-0 systemd[1]: session-26.scope: Consumed 1min 5.408s CPU time.
Feb 23 10:49:45 compute-0 systemd-logind[808]: Session 26 logged out. Waiting for processes to exit.
Feb 23 10:49:45 compute-0 systemd-logind[808]: Removed session 26.
Feb 23 10:49:47 compute-0 podman[207311]: 2026-02-23 10:49:47.870345676 +0000 UTC m=+0.067268556 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 23 10:49:52 compute-0 podman[207331]: 2026-02-23 10:49:52.9124087 +0000 UTC m=+0.108011280 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 10:49:53 compute-0 sshd-session[207359]: Connection closed by authenticating user root 143.198.30.3 port 60196 [preauth]
Feb 23 10:49:56 compute-0 podman[207361]: 2026-02-23 10:49:56.879573598 +0000 UTC m=+0.084356164 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7)
Feb 23 10:49:59 compute-0 podman[197002]: time="2026-02-23T10:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:49:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:49:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2136 "" "Go-http-client/1.1"
Feb 23 10:50:01 compute-0 openstack_network_exporter[199919]: ERROR   10:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:50:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:50:01 compute-0 openstack_network_exporter[199919]: ERROR   10:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:50:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:50:09 compute-0 podman[207388]: 2026-02-23 10:50:09.860922549 +0000 UTC m=+0.060046192 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:50:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:50:12.629 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:50:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:50:12.630 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:50:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:50:12.630 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:50:18 compute-0 podman[207413]: 2026-02-23 10:50:18.838789469 +0000 UTC m=+0.045837631 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:50:23 compute-0 podman[207434]: 2026-02-23 10:50:23.858478713 +0000 UTC m=+0.064605304 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, managed_by=edpm_ansible)
Feb 23 10:50:27 compute-0 sshd-session[207461]: Connection closed by authenticating user root 143.198.30.3 port 33724 [preauth]
Feb 23 10:50:27 compute-0 podman[207463]: 2026-02-23 10:50:27.842448882 +0000 UTC m=+0.048830731 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Feb 23 10:50:28 compute-0 sshd-session[207484]: Connection closed by authenticating user root 165.227.79.48 port 47060 [preauth]
Feb 23 10:50:40 compute-0 podman[207486]: 2026-02-23 10:50:40.866803219 +0000 UTC m=+0.065418017 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.730 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.730 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.730 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.730 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.883 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.884 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6108MB free_disk=73.24349975585938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.884 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.885 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.950 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.951 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.984 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:50:43 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.998 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:50:44 compute-0 nova_compute[187639]: 2026-02-23 10:50:43.999 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:50:44 compute-0 nova_compute[187639]: 2026-02-23 10:50:44.000 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.000 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.000 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.000 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.719 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:50:45 compute-0 nova_compute[187639]: 2026-02-23 10:50:45.720 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:46 compute-0 nova_compute[187639]: 2026-02-23 10:50:46.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:46 compute-0 nova_compute[187639]: 2026-02-23 10:50:46.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:50:46 compute-0 nova_compute[187639]: 2026-02-23 10:50:46.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:50:49 compute-0 podman[207512]: 2026-02-23 10:50:49.8738446 +0000 UTC m=+0.066875855 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:50:54 compute-0 podman[207532]: 2026-02-23 10:50:54.86731833 +0000 UTC m=+0.071829227 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 23 10:50:58 compute-0 podman[207558]: 2026-02-23 10:50:58.883084873 +0000 UTC m=+0.076767252 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1770267347, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64)
Feb 23 10:50:58 compute-0 sshd-session[207564]: Connection closed by authenticating user root 143.198.30.3 port 49258 [preauth]
Feb 23 10:50:59 compute-0 podman[197002]: time="2026-02-23T10:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:50:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:50:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2143 "" "Go-http-client/1.1"
Feb 23 10:51:01 compute-0 openstack_network_exporter[199919]: ERROR   10:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:51:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:51:01 compute-0 openstack_network_exporter[199919]: ERROR   10:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:51:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:51:04 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:04.993 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:51:04 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:04.994 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:51:04 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:04.995 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:51:05 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:05.084 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:51:05 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:05.086 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:51:07 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:07.087 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:51:11 compute-0 podman[207582]: 2026-02-23 10:51:11.855711933 +0000 UTC m=+0.055156151 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:51:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:12.631 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:51:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:12.631 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:51:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:51:12.632 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:51:15 compute-0 sshd-session[207607]: Connection closed by authenticating user root 165.227.79.48 port 34516 [preauth]
Feb 23 10:51:20 compute-0 podman[207609]: 2026-02-23 10:51:20.83964988 +0000 UTC m=+0.043233734 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 23 10:51:25 compute-0 podman[207628]: 2026-02-23 10:51:25.905270732 +0000 UTC m=+0.113217590 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 10:51:29 compute-0 podman[197002]: time="2026-02-23T10:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:51:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:51:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2142 "" "Go-http-client/1.1"
Feb 23 10:51:29 compute-0 podman[207657]: 2026-02-23 10:51:29.859877142 +0000 UTC m=+0.062052939 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Feb 23 10:51:31 compute-0 openstack_network_exporter[199919]: ERROR   10:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:51:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:51:31 compute-0 openstack_network_exporter[199919]: ERROR   10:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:51:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:51:32 compute-0 sshd-session[207678]: Connection closed by authenticating user root 143.198.30.3 port 53072 [preauth]
Feb 23 10:51:42 compute-0 podman[207680]: 2026-02-23 10:51:42.859590475 +0000 UTC m=+0.053877556 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:51:43 compute-0 rsyslogd[1017]: imjournal: 813 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 23 10:51:44 compute-0 nova_compute[187639]: 2026-02-23 10:51:44.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:44 compute-0 nova_compute[187639]: 2026-02-23 10:51:44.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.719 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.719 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.719 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.719 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.720 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.751 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.751 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.751 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.751 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.885 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.886 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6169MB free_disk=73.24349975585938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.886 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.887 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.956 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.957 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.976 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.996 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.998 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:51:45 compute-0 nova_compute[187639]: 2026-02-23 10:51:45.998 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:51:46 compute-0 nova_compute[187639]: 2026-02-23 10:51:46.968 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:46 compute-0 nova_compute[187639]: 2026-02-23 10:51:46.984 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:46 compute-0 nova_compute[187639]: 2026-02-23 10:51:46.984 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:51:47 compute-0 nova_compute[187639]: 2026-02-23 10:51:47.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:51:51 compute-0 podman[207705]: 2026-02-23 10:51:51.895364661 +0000 UTC m=+0.094096206 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 23 10:51:56 compute-0 podman[207724]: 2026-02-23 10:51:56.884342516 +0000 UTC m=+0.079718963 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.build-date=20260216)
Feb 23 10:51:59 compute-0 podman[197002]: time="2026-02-23T10:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:51:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:51:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Feb 23 10:52:00 compute-0 podman[207752]: 2026-02-23 10:52:00.839972855 +0000 UTC m=+0.049449812 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Feb 23 10:52:01 compute-0 sshd-session[207773]: Connection closed by authenticating user root 165.227.79.48 port 59608 [preauth]
Feb 23 10:52:01 compute-0 openstack_network_exporter[199919]: ERROR   10:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:52:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:52:01 compute-0 openstack_network_exporter[199919]: ERROR   10:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:52:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:52:05 compute-0 sshd-session[207775]: Connection closed by authenticating user root 143.198.30.3 port 54070 [preauth]
Feb 23 10:52:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:52:12.633 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:52:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:52:12.634 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:52:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:52:12.634 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:52:13 compute-0 podman[207777]: 2026-02-23 10:52:13.862834994 +0000 UTC m=+0.071050436 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:52:22 compute-0 podman[207801]: 2026-02-23 10:52:22.866627882 +0000 UTC m=+0.073700259 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:52:27 compute-0 podman[207821]: 2026-02-23 10:52:27.862369973 +0000 UTC m=+0.065913734 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, 
org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 10:52:29 compute-0 podman[197002]: time="2026-02-23T10:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:52:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:52:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2146 "" "Go-http-client/1.1"
Feb 23 10:52:31 compute-0 openstack_network_exporter[199919]: ERROR   10:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:52:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:52:31 compute-0 openstack_network_exporter[199919]: ERROR   10:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:52:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:52:31 compute-0 podman[207849]: 2026-02-23 10:52:31.86634306 +0000 UTC m=+0.067716944 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, release=1770267347, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Feb 23 10:52:37 compute-0 sshd-session[207870]: Connection closed by authenticating user root 143.198.30.3 port 39480 [preauth]
Feb 23 10:52:42 compute-0 nova_compute[187639]: 2026-02-23 10:52:42.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:42 compute-0 nova_compute[187639]: 2026-02-23 10:52:42.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 10:52:42 compute-0 nova_compute[187639]: 2026-02-23 10:52:42.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 10:52:42 compute-0 nova_compute[187639]: 2026-02-23 10:52:42.710 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:42 compute-0 nova_compute[187639]: 2026-02-23 10:52:42.711 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 10:52:42 compute-0 nova_compute[187639]: 2026-02-23 10:52:42.727 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:44 compute-0 podman[207872]: 2026-02-23 10:52:44.880284964 +0000 UTC m=+0.073359059 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:52:45 compute-0 nova_compute[187639]: 2026-02-23 10:52:45.737 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:45 compute-0 nova_compute[187639]: 2026-02-23 10:52:45.738 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:45 compute-0 nova_compute[187639]: 2026-02-23 10:52:45.738 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:52:45 compute-0 nova_compute[187639]: 2026-02-23 10:52:45.738 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:52:45 compute-0 nova_compute[187639]: 2026-02-23 10:52:45.750 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:52:46 compute-0 nova_compute[187639]: 2026-02-23 10:52:46.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:46 compute-0 nova_compute[187639]: 2026-02-23 10:52:46.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.694 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.694 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.732 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.734 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.939 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.941 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6196MB free_disk=73.24137115478516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.941 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:52:47 compute-0 nova_compute[187639]: 2026-02-23 10:52:47.941 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.075 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.076 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.151 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.253 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.253 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.269 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.294 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.321 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.335 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.338 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:52:48 compute-0 nova_compute[187639]: 2026-02-23 10:52:48.338 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:52:49 compute-0 sshd-session[207896]: Connection closed by authenticating user root 165.227.79.48 port 41698 [preauth]
Feb 23 10:52:53 compute-0 podman[207898]: 2026-02-23 10:52:53.847078896 +0000 UTC m=+0.052071194 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 10:52:58 compute-0 podman[207917]: 2026-02-23 10:52:58.859719862 +0000 UTC m=+0.066425559 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:52:59 compute-0 podman[197002]: time="2026-02-23T10:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:52:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:52:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2147 "" "Go-http-client/1.1"
Feb 23 10:53:01 compute-0 openstack_network_exporter[199919]: ERROR   10:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:53:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:53:01 compute-0 openstack_network_exporter[199919]: ERROR   10:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:53:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:53:02 compute-0 podman[207943]: 2026-02-23 10:53:02.85293458 +0000 UTC m=+0.056037414 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, distribution-scope=public, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc.)
Feb 23 10:53:10 compute-0 sshd-session[207964]: Connection closed by authenticating user root 143.198.30.3 port 45042 [preauth]
Feb 23 10:53:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:53:12.635 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:53:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:53:12.635 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:53:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:53:12.635 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:53:15 compute-0 podman[207966]: 2026-02-23 10:53:15.842549216 +0000 UTC m=+0.048745310 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:53:24 compute-0 podman[207991]: 2026-02-23 10:53:24.895886462 +0000 UTC m=+0.097897570 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Feb 23 10:53:29 compute-0 podman[197002]: time="2026-02-23T10:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:53:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:53:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2145 "" "Go-http-client/1.1"
Feb 23 10:53:29 compute-0 podman[208010]: 2026-02-23 10:53:29.924380077 +0000 UTC m=+0.129217684 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 10:53:31 compute-0 openstack_network_exporter[199919]: ERROR   10:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:53:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:53:31 compute-0 openstack_network_exporter[199919]: ERROR   10:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:53:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:53:33 compute-0 podman[208036]: 2026-02-23 10:53:33.838411335 +0000 UTC m=+0.045710330 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Feb 23 10:53:37 compute-0 sshd-session[208057]: Connection closed by authenticating user root 165.227.79.48 port 59244 [preauth]
Feb 23 10:53:43 compute-0 sshd-session[208059]: Connection closed by authenticating user root 143.198.30.3 port 44078 [preauth]
Feb 23 10:53:46 compute-0 podman[208061]: 2026-02-23 10:53:46.84866738 +0000 UTC m=+0.051794131 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.336 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.711 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.749 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.749 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.750 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.750 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.906 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.908 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6193MB free_disk=73.24139022827148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.908 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.908 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.988 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:53:47 compute-0 nova_compute[187639]: 2026-02-23 10:53:47.989 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:53:48 compute-0 nova_compute[187639]: 2026-02-23 10:53:48.007 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:53:48 compute-0 nova_compute[187639]: 2026-02-23 10:53:48.035 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:53:48 compute-0 nova_compute[187639]: 2026-02-23 10:53:48.036 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:53:48 compute-0 nova_compute[187639]: 2026-02-23 10:53:48.036 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:53:49 compute-0 nova_compute[187639]: 2026-02-23 10:53:49.014 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:49 compute-0 nova_compute[187639]: 2026-02-23 10:53:49.015 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:49 compute-0 nova_compute[187639]: 2026-02-23 10:53:49.015 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:53:49 compute-0 nova_compute[187639]: 2026-02-23 10:53:49.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:49 compute-0 nova_compute[187639]: 2026-02-23 10:53:49.706 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:53:55 compute-0 podman[208085]: 2026-02-23 10:53:55.843481986 +0000 UTC m=+0.050284771 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 23 10:53:59 compute-0 podman[197002]: time="2026-02-23T10:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:53:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:53:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2146 "" "Go-http-client/1.1"
Feb 23 10:54:00 compute-0 podman[208104]: 2026-02-23 10:54:00.861840001 +0000 UTC m=+0.070323915 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, 
io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 10:54:01 compute-0 openstack_network_exporter[199919]: ERROR   10:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:54:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:54:01 compute-0 openstack_network_exporter[199919]: ERROR   10:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:54:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:54:04 compute-0 podman[208132]: 2026-02-23 10:54:04.848451732 +0000 UTC m=+0.057341109 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Feb 23 10:54:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:54:12.636 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:54:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:54:12.637 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:54:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:54:12.637 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:54:17 compute-0 podman[208153]: 2026-02-23 10:54:17.868545455 +0000 UTC m=+0.073704441 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:54:18 compute-0 sshd-session[208177]: Connection closed by authenticating user root 143.198.30.3 port 50732 [preauth]
Feb 23 10:54:23 compute-0 sshd-session[208179]: Connection closed by authenticating user root 165.227.79.48 port 46490 [preauth]
Feb 23 10:54:26 compute-0 podman[208181]: 2026-02-23 10:54:26.827232454 +0000 UTC m=+0.035740626 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 10:54:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:54:28.054 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:54:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:54:28.055 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:54:29 compute-0 podman[197002]: time="2026-02-23T10:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:54:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:54:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Feb 23 10:54:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:54:30.058 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:54:31 compute-0 openstack_network_exporter[199919]: ERROR   10:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:54:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:54:31 compute-0 openstack_network_exporter[199919]: ERROR   10:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:54:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:54:31 compute-0 podman[208200]: 2026-02-23 10:54:31.856662877 +0000 UTC m=+0.059344061 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 10:54:35 compute-0 podman[208227]: 2026-02-23 10:54:35.866985016 +0000 UTC m=+0.068222996 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1770267347, io.buildah.version=1.33.7, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 10:54:46 compute-0 nova_compute[187639]: 2026-02-23 10:54:46.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.704 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.704 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.704 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.705 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.705 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.725 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.833 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.833 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6195MB free_disk=73.24139022827148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.834 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.834 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:54:48 compute-0 podman[208250]: 2026-02-23 10:54:48.852580072 +0000 UTC m=+0.050748274 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.890 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.890 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.912 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.923 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.924 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:54:48 compute-0 nova_compute[187639]: 2026-02-23 10:54:48.924 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:54:49 compute-0 nova_compute[187639]: 2026-02-23 10:54:49.911 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:49 compute-0 nova_compute[187639]: 2026-02-23 10:54:49.912 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:50 compute-0 sshd-session[208275]: Connection closed by authenticating user root 143.198.30.3 port 58360 [preauth]
Feb 23 10:54:51 compute-0 nova_compute[187639]: 2026-02-23 10:54:51.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:54:57 compute-0 podman[208277]: 2026-02-23 10:54:57.864903358 +0000 UTC m=+0.062772552 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 23 10:54:59 compute-0 podman[197002]: time="2026-02-23T10:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:54:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:54:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Feb 23 10:55:01 compute-0 openstack_network_exporter[199919]: ERROR   10:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:55:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:55:01 compute-0 openstack_network_exporter[199919]: ERROR   10:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:55:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:55:02 compute-0 podman[208297]: 2026-02-23 10:55:02.897809403 +0000 UTC m=+0.092809026 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 23 10:55:06 compute-0 podman[208325]: 2026-02-23 10:55:06.877997175 +0000 UTC m=+0.080738006 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, build-date=2026-02-05T04:57:10Z)
Feb 23 10:55:07 compute-0 sshd-session[208346]: Connection closed by authenticating user root 165.227.79.48 port 60518 [preauth]
Feb 23 10:55:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:12.637 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:12.638 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:12.638 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:19 compute-0 podman[208348]: 2026-02-23 10:55:19.878394428 +0000 UTC m=+0.085581519 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.460 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.460 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.482 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.568 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.569 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.575 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.576 187643 INFO nova.compute.claims [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Claim successful on node compute-0.ctlplane.example.com
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.682 187643 DEBUG nova.compute.provider_tree [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.696 187643 DEBUG nova.scheduler.client.report [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.714 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.716 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.763 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.764 187643 DEBUG nova.network.neutron [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.791 187643 INFO nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.813 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.926 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.928 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.928 187643 INFO nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Creating image(s)
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.929 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "/var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.929 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "/var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.930 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "/var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.930 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:20 compute-0 nova_compute[187639]: 2026-02-23 10:55:20.931 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:21 compute-0 nova_compute[187639]: 2026-02-23 10:55:21.483 187643 WARNING oslo_policy.policy [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 23 10:55:21 compute-0 nova_compute[187639]: 2026-02-23 10:55:21.484 187643 WARNING oslo_policy.policy [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 23 10:55:21 compute-0 nova_compute[187639]: 2026-02-23 10:55:21.485 187643 DEBUG nova.policy [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '38b4598a0d9649aaa7ba0cfac82e4414', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.254 187643 DEBUG nova.network.neutron [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Successfully created port: b6fd14a3-d411-4ffa-92c3-a38e98e7e599 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.491 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.567 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.part --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.569 187643 DEBUG nova.virt.images [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] 0ef805b1-b4a6-4839-ade3-d18a6c4b570e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.571 187643 DEBUG nova.privsep.utils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.571 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.part /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.712 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.part /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.converted" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.715 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.782 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29.converted --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.784 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:22 compute-0 nova_compute[187639]: 2026-02-23 10:55:22.813 187643 INFO oslo.privsep.daemon [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpt7077u6_/privsep.sock']
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.412 187643 DEBUG nova.network.neutron [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Successfully updated port: b6fd14a3-d411-4ffa-92c3-a38e98e7e599 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.434 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.435 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquired lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.435 187643 DEBUG nova.network.neutron [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.435 187643 INFO oslo.privsep.daemon [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Spawned new privsep daemon via rootwrap
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.333 208393 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.339 208393 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.343 208393 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.343 208393 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208393
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.500 187643 DEBUG nova.compute.manager [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received event network-changed-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.500 187643 DEBUG nova.compute.manager [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Refreshing instance network info cache due to event network-changed-b6fd14a3-d411-4ffa-92c3-a38e98e7e599. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.500 187643 DEBUG oslo_concurrency.lockutils [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.501 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.538 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.539 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.540 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.549 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.586 187643 DEBUG nova.network.neutron [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.590 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.591 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.622 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.623 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.623 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.663 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.664 187643 DEBUG nova.virt.disk.api [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Checking if we can resize image /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.664 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.723 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.724 187643 DEBUG nova.virt.disk.api [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Cannot resize image /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.724 187643 DEBUG nova.objects.instance [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'migration_context' on Instance uuid ff2f3092-5677-4653-b466-6507edb18e01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.738 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.738 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Ensure instance console log exists: /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.739 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.739 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:23 compute-0 nova_compute[187639]: 2026-02-23 10:55:23.739 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:24 compute-0 sshd-session[208410]: Connection closed by authenticating user root 143.198.30.3 port 37386 [preauth]
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.930 187643 DEBUG nova.network.neutron [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Updating instance_info_cache with network_info: [{"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.954 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Releasing lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.954 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Instance network_info: |[{"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.955 187643 DEBUG oslo_concurrency.lockutils [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.956 187643 DEBUG nova.network.neutron [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Refreshing network info cache for port b6fd14a3-d411-4ffa-92c3-a38e98e7e599 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.961 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Start _get_guest_xml network_info=[{"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.967 187643 WARNING nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.979 187643 DEBUG nova.virt.libvirt.host [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.980 187643 DEBUG nova.virt.libvirt.host [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.985 187643 DEBUG nova.virt.libvirt.host [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.986 187643 DEBUG nova.virt.libvirt.host [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.988 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.989 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.989 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.990 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.990 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.991 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.991 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.992 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.992 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.992 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.993 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.993 187643 DEBUG nova.virt.hardware [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 10:55:25 compute-0 nova_compute[187639]: 2026-02-23 10:55:25.999 187643 DEBUG nova.privsep.utils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.000 187643 DEBUG nova.virt.libvirt.vif [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T10:55:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1848147015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1848147015',id=2,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-gpmzgv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:55:20Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=ff2f3092-5677-4653-b466-6507edb18e01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.001 187643 DEBUG nova.network.os_vif_util [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.002 187643 DEBUG nova.network.os_vif_util [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.005 187643 DEBUG nova.objects.instance [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'pci_devices' on Instance uuid ff2f3092-5677-4653-b466-6507edb18e01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.023 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] End _get_guest_xml xml=<domain type="kvm">
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <uuid>ff2f3092-5677-4653-b466-6507edb18e01</uuid>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <name>instance-00000002</name>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <metadata>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1848147015</nova:name>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 10:55:25</nova:creationTime>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:user uuid="38b4598a0d9649aaa7ba0cfac82e4414">tempest-TestExecuteActionsViaActuator-1766821287-project-member</nova:user>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:project uuid="8b2fdb094fae4998b67f82aa76acda6a">tempest-TestExecuteActionsViaActuator-1766821287</nova:project>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         <nova:port uuid="b6fd14a3-d411-4ffa-92c3-a38e98e7e599">
Feb 23 10:55:26 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </metadata>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <system>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <entry name="serial">ff2f3092-5677-4653-b466-6507edb18e01</entry>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <entry name="uuid">ff2f3092-5677-4653-b466-6507edb18e01</entry>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </system>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <os>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </os>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <features>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <apic/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </features>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </clock>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.config"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:61:12:3e"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <target dev="tapb6fd14a3-d4"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/console.log" append="off"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </serial>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <video>
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </video>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 10:55:26 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 10:55:26 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 10:55:26 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:55:26 compute-0 nova_compute[187639]: </domain>
Feb 23 10:55:26 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.025 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Preparing to wait for external event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.025 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.025 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.026 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.027 187643 DEBUG nova.virt.libvirt.vif [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T10:55:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1848147015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1848147015',id=2,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-gpmzgv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766
821287-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:55:20Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=ff2f3092-5677-4653-b466-6507edb18e01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.027 187643 DEBUG nova.network.os_vif_util [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.028 187643 DEBUG nova.network.os_vif_util [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.029 187643 DEBUG os_vif [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.079 187643 DEBUG ovsdbapp.backend.ovs_idl [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.080 187643 DEBUG ovsdbapp.backend.ovs_idl [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.080 187643 DEBUG ovsdbapp.backend.ovs_idl [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.081 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.082 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.083 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.084 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.088 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.094 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.109 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.110 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.110 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.111 187643 INFO oslo.privsep.daemon [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp0fhr86b1/privsep.sock']
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.687 187643 INFO oslo.privsep.daemon [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Spawned new privsep daemon via rootwrap
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.591 208416 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.597 208416 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.600 208416 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.601 208416 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208416
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.956 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.958 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6fd14a3-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.959 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb6fd14a3-d4, col_values=(('external_ids', {'iface-id': 'b6fd14a3-d411-4ffa-92c3-a38e98e7e599', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:12:3e', 'vm-uuid': 'ff2f3092-5677-4653-b466-6507edb18e01'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.961 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 NetworkManager[57207]: <info>  [1771844126.9625] manager: (tapb6fd14a3-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.965 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.968 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:26 compute-0 nova_compute[187639]: 2026-02-23 10:55:26.969 187643 INFO os_vif [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4')
Feb 23 10:55:27 compute-0 nova_compute[187639]: 2026-02-23 10:55:27.030 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:55:27 compute-0 nova_compute[187639]: 2026-02-23 10:55:27.031 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:55:27 compute-0 nova_compute[187639]: 2026-02-23 10:55:27.031 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] No VIF found with MAC fa:16:3e:61:12:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 10:55:27 compute-0 nova_compute[187639]: 2026-02-23 10:55:27.032 187643 INFO nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Using config drive
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.120 187643 INFO nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Creating config drive at /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.config
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.125 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpgyfwhyya execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.248 187643 DEBUG oslo_concurrency.processutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpgyfwhyya" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:28 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 23 10:55:28 compute-0 kernel: tapb6fd14a3-d4: entered promiscuous mode
Feb 23 10:55:28 compute-0 NetworkManager[57207]: <info>  [1771844128.3241] manager: (tapb6fd14a3-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 23 10:55:28 compute-0 ovn_controller[97601]: 2026-02-23T10:55:28Z|00027|binding|INFO|Claiming lport b6fd14a3-d411-4ffa-92c3-a38e98e7e599 for this chassis.
Feb 23 10:55:28 compute-0 ovn_controller[97601]: 2026-02-23T10:55:28Z|00028|binding|INFO|b6fd14a3-d411-4ffa-92c3-a38e98e7e599: Claiming fa:16:3e:61:12:3e 10.100.0.12
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.326 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.331 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:28 compute-0 systemd-udevd[208452]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.352 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:12:3e 10.100.0.12'], port_security=['fa:16:3e:61:12:3e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ff2f3092-5677-4653-b466-6507edb18e01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b6fd14a3-d411-4ffa-92c3-a38e98e7e599) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.355 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b6fd14a3-d411-4ffa-92c3-a38e98e7e599 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a bound to our chassis
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.359 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.360 106968 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpparj9ke8/privsep.sock']
Feb 23 10:55:28 compute-0 NetworkManager[57207]: <info>  [1771844128.3706] device (tapb6fd14a3-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:55:28 compute-0 NetworkManager[57207]: <info>  [1771844128.3713] device (tapb6fd14a3-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.374 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:28 compute-0 ovn_controller[97601]: 2026-02-23T10:55:28Z|00029|binding|INFO|Setting lport b6fd14a3-d411-4ffa-92c3-a38e98e7e599 ovn-installed in OVS
Feb 23 10:55:28 compute-0 ovn_controller[97601]: 2026-02-23T10:55:28Z|00030|binding|INFO|Setting lport b6fd14a3-d411-4ffa-92c3-a38e98e7e599 up in Southbound
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.377 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:28 compute-0 podman[208432]: 2026-02-23 10:55:28.393894922 +0000 UTC m=+0.081050458 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:55:28 compute-0 systemd-machined[156970]: New machine qemu-1-instance-00000002.
Feb 23 10:55:28 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.739 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844128.7394783, ff2f3092-5677-4653-b466-6507edb18e01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.740 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] VM Started (Lifecycle Event)
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.770 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.774 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844128.7419965, ff2f3092-5677-4653-b466-6507edb18e01 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.774 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] VM Paused (Lifecycle Event)
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.802 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.804 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:55:28 compute-0 nova_compute[187639]: 2026-02-23 10:55:28.829 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.994 106968 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.995 106968 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpparj9ke8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.902 208487 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.905 208487 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.907 208487 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.907 208487 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208487
Feb 23 10:55:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:28.997 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[92b68383-32e4-4ffc-8554-3ff443285948]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.109 187643 DEBUG nova.network.neutron [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Updated VIF entry in instance network info cache for port b6fd14a3-d411-4ffa-92c3-a38e98e7e599. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.109 187643 DEBUG nova.network.neutron [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Updating instance_info_cache with network_info: [{"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.122 187643 DEBUG oslo_concurrency.lockutils [req-03210dee-10a6-4385-aecd-748dfe4f57f8 req-537ee66b-3e16-4137-90e7-8e221c90711c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.279 187643 DEBUG nova.compute.manager [req-715b0a78-50dd-4eb1-b234-d704d637f087 req-b7b867b9-caef-4433-9aa7-5d21938294fc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.279 187643 DEBUG oslo_concurrency.lockutils [req-715b0a78-50dd-4eb1-b234-d704d637f087 req-b7b867b9-caef-4433-9aa7-5d21938294fc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.279 187643 DEBUG oslo_concurrency.lockutils [req-715b0a78-50dd-4eb1-b234-d704d637f087 req-b7b867b9-caef-4433-9aa7-5d21938294fc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.280 187643 DEBUG oslo_concurrency.lockutils [req-715b0a78-50dd-4eb1-b234-d704d637f087 req-b7b867b9-caef-4433-9aa7-5d21938294fc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.280 187643 DEBUG nova.compute.manager [req-715b0a78-50dd-4eb1-b234-d704d637f087 req-b7b867b9-caef-4433-9aa7-5d21938294fc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Processing event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.280 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.282 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.302 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844129.3019419, ff2f3092-5677-4653-b466-6507edb18e01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.302 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] VM Resumed (Lifecycle Event)
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.304 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.307 187643 INFO nova.virt.libvirt.driver [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Instance spawned successfully.
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.307 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.320 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.327 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.329 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.330 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.330 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.331 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.331 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.331 187643 DEBUG nova.virt.libvirt.driver [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.352 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.352 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.354 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.408 187643 INFO nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Took 8.48 seconds to spawn the instance on the hypervisor.
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.409 187643 DEBUG nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.419 208487 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.420 208487 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.420 208487 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.494 187643 INFO nova.compute.manager [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Took 8.95 seconds to build instance.
Feb 23 10:55:29 compute-0 nova_compute[187639]: 2026-02-23 10:55:29.684 187643 DEBUG oslo_concurrency.lockutils [None req-c917cad0-6f17-43e0-a2b2-04c15c8559f7 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:29 compute-0 podman[197002]: time="2026-02-23T10:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:55:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:55:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2155 "" "Go-http-client/1.1"
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.944 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[42a20330-3dc7-4010-863b-11ce573cc910]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.946 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa10ae0ff-b1 in ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.947 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa10ae0ff-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.947 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3995af55-006f-4088-b70e-5b66e7a95c9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.950 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d8ed8fbe-befd-480e-9841-a397b6a9155e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.970 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[dc58f2bb-0ec9-49dd-b262-a6401587e98b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.977 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe8744d-d47b-4c44-92ac-453b1a9fe126]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:29.979 106968 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpk5_rft83/privsep.sock']
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.538 106968 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.539 106968 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpk5_rft83/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.437 208501 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.441 208501 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.442 208501 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.442 208501 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208501
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.541 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[44bf242f-03ae-48d6-8bdd-84a0eefc07f9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.965 208501 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.965 208501 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:30.965 208501 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.380 187643 DEBUG nova.compute.manager [req-79577e74-b375-4773-9408-d685263c662d req-d45bad89-a28d-4c54-994d-2b272040a5d7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.382 187643 DEBUG oslo_concurrency.lockutils [req-79577e74-b375-4773-9408-d685263c662d req-d45bad89-a28d-4c54-994d-2b272040a5d7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.383 187643 DEBUG oslo_concurrency.lockutils [req-79577e74-b375-4773-9408-d685263c662d req-d45bad89-a28d-4c54-994d-2b272040a5d7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.383 187643 DEBUG oslo_concurrency.lockutils [req-79577e74-b375-4773-9408-d685263c662d req-d45bad89-a28d-4c54-994d-2b272040a5d7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.384 187643 DEBUG nova.compute.manager [req-79577e74-b375-4773-9408-d685263c662d req-d45bad89-a28d-4c54-994d-2b272040a5d7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] No waiting events found dispatching network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.384 187643 WARNING nova.compute.manager [req-79577e74-b375-4773-9408-d685263c662d req-d45bad89-a28d-4c54-994d-2b272040a5d7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received unexpected event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 for instance with vm_state active and task_state None.
Feb 23 10:55:31 compute-0 openstack_network_exporter[199919]: ERROR   10:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:55:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:55:31 compute-0 openstack_network_exporter[199919]: ERROR   10:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:55:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.464 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[31afa000-8ed9-4b4f-9155-300dc2008040]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 NetworkManager[57207]: <info>  [1771844131.4799] manager: (tapa10ae0ff-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.480 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0c09f7-b2db-4804-9bb6-e04c0e6b5767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 systemd-udevd[208513]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.506 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[77a06b80-0ef8-4ee5-91ac-28afc81d8fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.510 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[2e118dc4-1983-47ac-a1b6-24758a2b6976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 NetworkManager[57207]: <info>  [1771844131.5393] device (tapa10ae0ff-b0): carrier: link connected
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.542 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d86cec-b09a-4e7b-966b-3d3eb1e87561]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.555 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a039cc48-1bde-458d-8416-0bd8c47e4c58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 24240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208531, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.566 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8065491e-63de-46d6-a6be-735d37c09a51]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feca:35da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327570, 'tstamp': 327570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208532, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.576 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ee770bcf-3f63-489a-b197-150524f20207]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 24240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208533, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.593 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[182ffc72-7003-475a-8664-75d4aabf62a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.626 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[242dcaf0-8e5e-4498-a6e5-26b8425e01d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.627 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.628 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.628 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.630 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:31 compute-0 NetworkManager[57207]: <info>  [1771844131.6323] manager: (tapa10ae0ff-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Feb 23 10:55:31 compute-0 kernel: tapa10ae0ff-b0: entered promiscuous mode
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.635 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.636 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.637 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:31 compute-0 ovn_controller[97601]: 2026-02-23T10:55:31Z|00031|binding|INFO|Releasing lport 1f2f61eb-a013-45e9-8854-1868e5df18d6 from this chassis (sb_readonly=0)
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.640 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.640 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a10ae0ff-ba31-43e9-bf1d-9df93406b21a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a10ae0ff-ba31-43e9-bf1d-9df93406b21a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 10:55:31 compute-0 nova_compute[187639]: 2026-02-23 10:55:31.642 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.641 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e239aeed-17e2-4bea-97fb-95dd0f93a4b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.643 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: global
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/a10ae0ff-ba31-43e9-bf1d-9df93406b21a.pid.haproxy
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 10:55:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:31.643 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'env', 'PROCESS_TAG=haproxy-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a10ae0ff-ba31-43e9-bf1d-9df93406b21a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 10:55:31 compute-0 podman[208566]: 2026-02-23 10:55:31.970219554 +0000 UTC m=+0.047616419 container create 42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 10:55:32 compute-0 podman[208566]: 2026-02-23 10:55:31.941170241 +0000 UTC m=+0.018567126 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 10:55:32 compute-0 nova_compute[187639]: 2026-02-23 10:55:32.137 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:32 compute-0 systemd[1]: Started libpod-conmon-42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046.scope.
Feb 23 10:55:32 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:55:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cd0b9747d6c24942de8f03e484fe68b8c1d6f449d88acd963564784b5a72458/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:55:32 compute-0 podman[208566]: 2026-02-23 10:55:32.200484114 +0000 UTC m=+0.277880999 container init 42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 10:55:32 compute-0 podman[208566]: 2026-02-23 10:55:32.204151471 +0000 UTC m=+0.281548336 container start 42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 10:55:32 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [NOTICE]   (208585) : New worker (208587) forked
Feb 23 10:55:32 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [NOTICE]   (208585) : Loading success.
Feb 23 10:55:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:32.242 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:55:33 compute-0 podman[208596]: 2026-02-23 10:55:33.891280962 +0000 UTC m=+0.098640627 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 10:55:34 compute-0 nova_compute[187639]: 2026-02-23 10:55:34.327 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:35.244 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:37 compute-0 nova_compute[187639]: 2026-02-23 10:55:37.138 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:37 compute-0 podman[208622]: 2026-02-23 10:55:37.838539748 +0000 UTC m=+0.045792761 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, architecture=x86_64, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 10:55:39 compute-0 nova_compute[187639]: 2026-02-23 10:55:39.328 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:39 compute-0 nova_compute[187639]: 2026-02-23 10:55:39.842 187643 DEBUG nova.compute.manager [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Feb 23 10:55:39 compute-0 nova_compute[187639]: 2026-02-23 10:55:39.965 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:39 compute-0 nova_compute[187639]: 2026-02-23 10:55:39.966 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.111 187643 DEBUG nova.objects.instance [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'pci_requests' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.133 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.133 187643 INFO nova.compute.claims [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Claim successful on node compute-0.ctlplane.example.com
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.134 187643 DEBUG nova.objects.instance [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'resources' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.145 187643 DEBUG nova.objects.instance [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'pci_devices' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.199 187643 INFO nova.compute.resource_tracker [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating resource usage from migration 64a32f9f-ca71-48f1-89be-b239b8fa0c57
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.200 187643 DEBUG nova.compute.resource_tracker [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Starting to track incoming migration 64a32f9f-ca71-48f1-89be-b239b8fa0c57 with flavor 12382dfb-0aa4-43b0-b06c-e44eafc96c44 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.297 187643 DEBUG nova.compute.provider_tree [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.340 187643 ERROR nova.scheduler.client.report [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [req-c5239dac-3b6f-483f-b601-76b1c19fce4c] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 8ecb3de0-8241-4d60-9a57-9609e064b906.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-c5239dac-3b6f-483f-b601-76b1c19fce4c"}]}
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.363 187643 DEBUG nova.scheduler.client.report [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.390 187643 DEBUG nova.scheduler.client.report [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.391 187643 DEBUG nova.compute.provider_tree [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.411 187643 DEBUG nova.scheduler.client.report [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.442 187643 DEBUG nova.scheduler.client.report [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.519 187643 DEBUG nova.compute.provider_tree [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.574 187643 DEBUG nova.scheduler.client.report [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updated inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.575 187643 DEBUG nova.compute.provider_tree [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.575 187643 DEBUG nova.compute.provider_tree [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.606 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.606 187643 INFO nova.compute.manager [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Migrating
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.607 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.607 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.616 187643 INFO nova.compute.rpcapi [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Feb 23 10:55:40 compute-0 nova_compute[187639]: 2026-02-23 10:55:40.616 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:55:40 compute-0 ovn_controller[97601]: 2026-02-23T10:55:40Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:12:3e 10.100.0.12
Feb 23 10:55:40 compute-0 ovn_controller[97601]: 2026-02-23T10:55:40Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:12:3e 10.100.0.12
Feb 23 10:55:42 compute-0 nova_compute[187639]: 2026-02-23 10:55:42.139 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:42 compute-0 sshd-session[208657]: Accepted publickey for nova from 192.168.122.101 port 40722 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 10:55:42 compute-0 systemd-logind[808]: New session 27 of user nova.
Feb 23 10:55:42 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 10:55:42 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 10:55:42 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 10:55:42 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 10:55:42 compute-0 systemd[208661]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:55:42 compute-0 systemd[208661]: Queued start job for default target Main User Target.
Feb 23 10:55:42 compute-0 systemd[208661]: Created slice User Application Slice.
Feb 23 10:55:42 compute-0 systemd[208661]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 10:55:42 compute-0 systemd[208661]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 10:55:42 compute-0 systemd[208661]: Reached target Paths.
Feb 23 10:55:42 compute-0 systemd[208661]: Reached target Timers.
Feb 23 10:55:42 compute-0 systemd[208661]: Starting D-Bus User Message Bus Socket...
Feb 23 10:55:42 compute-0 systemd[208661]: Starting Create User's Volatile Files and Directories...
Feb 23 10:55:42 compute-0 systemd[208661]: Listening on D-Bus User Message Bus Socket.
Feb 23 10:55:42 compute-0 systemd[208661]: Reached target Sockets.
Feb 23 10:55:42 compute-0 systemd[208661]: Finished Create User's Volatile Files and Directories.
Feb 23 10:55:42 compute-0 systemd[208661]: Reached target Basic System.
Feb 23 10:55:42 compute-0 systemd[208661]: Reached target Main User Target.
Feb 23 10:55:42 compute-0 systemd[208661]: Startup finished in 162ms.
Feb 23 10:55:42 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 10:55:42 compute-0 systemd[1]: Started Session 27 of User nova.
Feb 23 10:55:42 compute-0 sshd-session[208657]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:55:42 compute-0 sshd-session[208676]: Received disconnect from 192.168.122.101 port 40722:11: disconnected by user
Feb 23 10:55:42 compute-0 sshd-session[208676]: Disconnected from user nova 192.168.122.101 port 40722
Feb 23 10:55:42 compute-0 sshd-session[208657]: pam_unix(sshd:session): session closed for user nova
Feb 23 10:55:42 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Feb 23 10:55:42 compute-0 systemd-logind[808]: Session 27 logged out. Waiting for processes to exit.
Feb 23 10:55:42 compute-0 systemd-logind[808]: Removed session 27.
Feb 23 10:55:43 compute-0 sshd-session[208678]: Accepted publickey for nova from 192.168.122.101 port 40734 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 10:55:43 compute-0 systemd-logind[808]: New session 29 of user nova.
Feb 23 10:55:43 compute-0 systemd[1]: Started Session 29 of User nova.
Feb 23 10:55:43 compute-0 sshd-session[208678]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:55:43 compute-0 sshd-session[208682]: Received disconnect from 192.168.122.101 port 40734:11: disconnected by user
Feb 23 10:55:43 compute-0 sshd-session[208682]: Disconnected from user nova 192.168.122.101 port 40734
Feb 23 10:55:43 compute-0 sshd-session[208678]: pam_unix(sshd:session): session closed for user nova
Feb 23 10:55:43 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Feb 23 10:55:43 compute-0 systemd-logind[808]: Session 29 logged out. Waiting for processes to exit.
Feb 23 10:55:43 compute-0 systemd-logind[808]: Removed session 29.
Feb 23 10:55:43 compute-0 sshd-session[208656]: Invalid user test from 80.94.95.116 port 61578
Feb 23 10:55:44 compute-0 nova_compute[187639]: 2026-02-23 10:55:44.331 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:44 compute-0 sshd-session[208656]: Connection closed by invalid user test 80.94.95.116 port 61578 [preauth]
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.123 187643 DEBUG nova.compute.manager [req-d2f3b992-509f-47c6-a39b-d24ebf58b042 req-0195d272-384b-4992-9674-43533afb5b8b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-unplugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.124 187643 DEBUG oslo_concurrency.lockutils [req-d2f3b992-509f-47c6-a39b-d24ebf58b042 req-0195d272-384b-4992-9674-43533afb5b8b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.124 187643 DEBUG oslo_concurrency.lockutils [req-d2f3b992-509f-47c6-a39b-d24ebf58b042 req-0195d272-384b-4992-9674-43533afb5b8b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.124 187643 DEBUG oslo_concurrency.lockutils [req-d2f3b992-509f-47c6-a39b-d24ebf58b042 req-0195d272-384b-4992-9674-43533afb5b8b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.124 187643 DEBUG nova.compute.manager [req-d2f3b992-509f-47c6-a39b-d24ebf58b042 req-0195d272-384b-4992-9674-43533afb5b8b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] No waiting events found dispatching network-vif-unplugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.125 187643 WARNING nova.compute.manager [req-d2f3b992-509f-47c6-a39b-d24ebf58b042 req-0195d272-384b-4992-9674-43533afb5b8b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received unexpected event network-vif-unplugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for instance with vm_state active and task_state resize_migrating.
Feb 23 10:55:46 compute-0 sshd-session[208685]: Accepted publickey for nova from 192.168.122.101 port 34792 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 10:55:46 compute-0 systemd-logind[808]: New session 30 of user nova.
Feb 23 10:55:46 compute-0 systemd[1]: Started Session 30 of User nova.
Feb 23 10:55:46 compute-0 sshd-session[208685]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:55:46 compute-0 nova_compute[187639]: 2026-02-23 10:55:46.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:46 compute-0 sshd-session[208688]: Received disconnect from 192.168.122.101 port 34792:11: disconnected by user
Feb 23 10:55:46 compute-0 sshd-session[208688]: Disconnected from user nova 192.168.122.101 port 34792
Feb 23 10:55:46 compute-0 sshd-session[208685]: pam_unix(sshd:session): session closed for user nova
Feb 23 10:55:46 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Feb 23 10:55:46 compute-0 systemd-logind[808]: Session 30 logged out. Waiting for processes to exit.
Feb 23 10:55:46 compute-0 systemd-logind[808]: Removed session 30.
Feb 23 10:55:46 compute-0 sshd-session[208690]: Accepted publickey for nova from 192.168.122.101 port 34808 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 10:55:46 compute-0 systemd-logind[808]: New session 31 of user nova.
Feb 23 10:55:46 compute-0 systemd[1]: Started Session 31 of User nova.
Feb 23 10:55:46 compute-0 sshd-session[208690]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:55:46 compute-0 sshd-session[208693]: Received disconnect from 192.168.122.101 port 34808:11: disconnected by user
Feb 23 10:55:46 compute-0 sshd-session[208693]: Disconnected from user nova 192.168.122.101 port 34808
Feb 23 10:55:46 compute-0 sshd-session[208690]: pam_unix(sshd:session): session closed for user nova
Feb 23 10:55:46 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Feb 23 10:55:46 compute-0 systemd-logind[808]: Session 31 logged out. Waiting for processes to exit.
Feb 23 10:55:46 compute-0 systemd-logind[808]: Removed session 31.
Feb 23 10:55:47 compute-0 sshd-session[208695]: Accepted publickey for nova from 192.168.122.101 port 34824 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 10:55:47 compute-0 systemd-logind[808]: New session 32 of user nova.
Feb 23 10:55:47 compute-0 systemd[1]: Started Session 32 of User nova.
Feb 23 10:55:47 compute-0 sshd-session[208695]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:55:47 compute-0 nova_compute[187639]: 2026-02-23 10:55:47.141 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:47 compute-0 sshd-session[208698]: Received disconnect from 192.168.122.101 port 34824:11: disconnected by user
Feb 23 10:55:47 compute-0 sshd-session[208698]: Disconnected from user nova 192.168.122.101 port 34824
Feb 23 10:55:47 compute-0 sshd-session[208695]: pam_unix(sshd:session): session closed for user nova
Feb 23 10:55:47 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Feb 23 10:55:47 compute-0 systemd-logind[808]: Session 32 logged out. Waiting for processes to exit.
Feb 23 10:55:47 compute-0 systemd-logind[808]: Removed session 32.
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.131 187643 INFO nova.network.neutron [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating port 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.263 187643 DEBUG nova.compute.manager [req-603b5a73-5d4f-46d4-b097-71480aea89b8 req-38e5c483-a441-496b-9ba8-ac48845c8995 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.264 187643 DEBUG oslo_concurrency.lockutils [req-603b5a73-5d4f-46d4-b097-71480aea89b8 req-38e5c483-a441-496b-9ba8-ac48845c8995 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.265 187643 DEBUG oslo_concurrency.lockutils [req-603b5a73-5d4f-46d4-b097-71480aea89b8 req-38e5c483-a441-496b-9ba8-ac48845c8995 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.265 187643 DEBUG oslo_concurrency.lockutils [req-603b5a73-5d4f-46d4-b097-71480aea89b8 req-38e5c483-a441-496b-9ba8-ac48845c8995 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.265 187643 DEBUG nova.compute.manager [req-603b5a73-5d4f-46d4-b097-71480aea89b8 req-38e5c483-a441-496b-9ba8-ac48845c8995 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] No waiting events found dispatching network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.266 187643 WARNING nova.compute.manager [req-603b5a73-5d4f-46d4-b097-71480aea89b8 req-38e5c483-a441-496b-9ba8-ac48845c8995 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received unexpected event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for instance with vm_state active and task_state resize_migrated.
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:48 compute-0 nova_compute[187639]: 2026-02-23 10:55:48.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.355 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.356 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.356 187643 DEBUG nova.network.neutron [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.362 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.451 187643 DEBUG nova.compute.manager [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-changed-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.451 187643 DEBUG nova.compute.manager [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Refreshing instance network info cache due to event network-changed-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.452 187643 DEBUG oslo_concurrency.lockutils [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:49 compute-0 nova_compute[187639]: 2026-02-23 10:55:49.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:55:50 compute-0 nova_compute[187639]: 2026-02-23 10:55:50.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:50 compute-0 nova_compute[187639]: 2026-02-23 10:55:50.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:55:50 compute-0 nova_compute[187639]: 2026-02-23 10:55:50.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:55:50 compute-0 nova_compute[187639]: 2026-02-23 10:55:50.804 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:55:50 compute-0 podman[208700]: 2026-02-23 10:55:50.84538359 +0000 UTC m=+0.049891629 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:55:51 compute-0 nova_compute[187639]: 2026-02-23 10:55:51.133 187643 DEBUG nova.network.neutron [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating instance_info_cache with network_info: [{"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:55:51 compute-0 nova_compute[187639]: 2026-02-23 10:55:51.152 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:55:51 compute-0 nova_compute[187639]: 2026-02-23 10:55:51.155 187643 DEBUG oslo_concurrency.lockutils [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:55:51 compute-0 nova_compute[187639]: 2026-02-23 10:55:51.156 187643 DEBUG nova.network.neutron [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Refreshing network info cache for port 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.059 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.061 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.061 187643 INFO nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Creating image(s)
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.062 187643 DEBUG nova.objects.instance [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'trusted_certs' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.077 187643 DEBUG oslo_concurrency.processutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.132 187643 DEBUG oslo_concurrency.processutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.133 187643 DEBUG nova.virt.disk.api [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Checking if we can resize image /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.133 187643 DEBUG oslo_concurrency.processutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.180 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.182 187643 DEBUG oslo_concurrency.processutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.182 187643 DEBUG nova.virt.disk.api [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Cannot resize image /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.201 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.201 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Ensure instance console log exists: /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.201 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.202 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.202 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.204 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Start _get_guest_xml network_info=[{"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "vif_mac": "fa:16:3e:cf:97:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.209 187643 WARNING nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.216 187643 DEBUG nova.virt.libvirt.host [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.216 187643 DEBUG nova.virt.libvirt.host [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.219 187643 DEBUG nova.virt.libvirt.host [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.220 187643 DEBUG nova.virt.libvirt.host [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.222 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.223 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='12382dfb-0aa4-43b0-b06c-e44eafc96c44',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.223 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.224 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.224 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.225 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.225 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.226 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.226 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.227 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.227 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.228 187643 DEBUG nova.virt.hardware [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.228 187643 DEBUG nova.objects.instance [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'vcpu_model' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.256 187643 DEBUG oslo_concurrency.processutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.315 187643 DEBUG oslo_concurrency.processutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.316 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "/var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.317 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "/var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.318 187643 DEBUG oslo_concurrency.lockutils [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "/var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.320 187643 DEBUG nova.virt.libvirt.vif [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:55:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1457314767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1457314767',id=1,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-8g7unckd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:55:47Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=ce5f9655-093d-401a-8279-2affb3f9ea4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "vif_mac": "fa:16:3e:cf:97:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.321 187643 DEBUG nova.network.os_vif_util [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "vif_mac": "fa:16:3e:cf:97:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.322 187643 DEBUG nova.network.os_vif_util [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.326 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] End _get_guest_xml xml=<domain type="kvm">
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <uuid>ce5f9655-093d-401a-8279-2affb3f9ea4c</uuid>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <name>instance-00000001</name>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <memory>196608</memory>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <metadata>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1457314767</nova:name>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 10:55:52</nova:creationTime>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:flavor name="m1.micro">
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:memory>192</nova:memory>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:user uuid="38b4598a0d9649aaa7ba0cfac82e4414">tempest-TestExecuteActionsViaActuator-1766821287-project-member</nova:user>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:project uuid="8b2fdb094fae4998b67f82aa76acda6a">tempest-TestExecuteActionsViaActuator-1766821287</nova:project>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         <nova:port uuid="3335f10d-83b3-44e3-8ba7-ddb0cda1ff98">
Feb 23 10:55:52 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </metadata>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <system>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <entry name="serial">ce5f9655-093d-401a-8279-2affb3f9ea4c</entry>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <entry name="uuid">ce5f9655-093d-401a-8279-2affb3f9ea4c</entry>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </system>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <os>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </os>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <features>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <apic/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </features>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </clock>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk.config"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:cf:97:04"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <target dev="tap3335f10d-83"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/console.log" append="off"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </serial>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <video>
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </video>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 10:55:52 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 10:55:52 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 10:55:52 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:55:52 compute-0 nova_compute[187639]: </domain>
Feb 23 10:55:52 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.328 187643 DEBUG nova.virt.libvirt.vif [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:55:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1457314767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1457314767',id=1,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-8g7unckd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:55:47Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=ce5f9655-093d-401a-8279-2affb3f9ea4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "vif_mac": "fa:16:3e:cf:97:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.329 187643 DEBUG nova.network.os_vif_util [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "vif_mac": "fa:16:3e:cf:97:04"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.330 187643 DEBUG nova.network.os_vif_util [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.330 187643 DEBUG os_vif [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.331 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.332 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.332 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.335 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.335 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3335f10d-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.336 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3335f10d-83, col_values=(('external_ids', {'iface-id': '3335f10d-83b3-44e3-8ba7-ddb0cda1ff98', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:97:04', 'vm-uuid': 'ce5f9655-093d-401a-8279-2affb3f9ea4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.337 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 NetworkManager[57207]: <info>  [1771844152.3381] manager: (tap3335f10d-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.340 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.344 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.345 187643 INFO os_vif [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83')
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.398 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.398 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.399 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No VIF found with MAC fa:16:3e:cf:97:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.399 187643 INFO nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Using config drive
Feb 23 10:55:52 compute-0 kernel: tap3335f10d-83: entered promiscuous mode
Feb 23 10:55:52 compute-0 NetworkManager[57207]: <info>  [1771844152.4291] manager: (tap3335f10d-83): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Feb 23 10:55:52 compute-0 ovn_controller[97601]: 2026-02-23T10:55:52Z|00032|binding|INFO|Claiming lport 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for this chassis.
Feb 23 10:55:52 compute-0 ovn_controller[97601]: 2026-02-23T10:55:52Z|00033|binding|INFO|3335f10d-83b3-44e3-8ba7-ddb0cda1ff98: Claiming fa:16:3e:cf:97:04 10.100.0.10
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.431 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 ovn_controller[97601]: 2026-02-23T10:55:52Z|00034|binding|INFO|Setting lport 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 ovn-installed in OVS
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.435 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.437 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 ovn_controller[97601]: 2026-02-23T10:55:52Z|00035|binding|INFO|Setting lport 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 up in Southbound
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.439 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:97:04 10.100.0.10'], port_security=['fa:16:3e:cf:97:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce5f9655-093d-401a-8279-2affb3f9ea4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.440 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a bound to our chassis
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.441 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:55:52 compute-0 systemd-udevd[208751]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.453 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b359af32-4836-433c-a4db-fd5cf0508da7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:52 compute-0 systemd-machined[156970]: New machine qemu-2-instance-00000001.
Feb 23 10:55:52 compute-0 NetworkManager[57207]: <info>  [1771844152.4667] device (tap3335f10d-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:55:52 compute-0 NetworkManager[57207]: <info>  [1771844152.4680] device (tap3335f10d-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 10:55:52 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.474 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[1be0069a-7875-41d4-93f4-4cb06e99a6a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.477 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[5101294a-2801-474a-9ee8-3269e466c952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.499 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[267b1190-00ed-48b6-9cce-282d088e1648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.511 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[56e37959-2563-4b62-860e-79ce5df75bf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 24240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208764, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.522 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[048c5852-22df-422e-92d6-2716edc24639]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327577, 'tstamp': 327577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208766, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327579, 'tstamp': 327579}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208766, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.524 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.525 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.527 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.527 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.527 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.528 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:55:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:55:52.528 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.771 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844152.7715762, ce5f9655-093d-401a-8279-2affb3f9ea4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.772 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] VM Resumed (Lifecycle Event)
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.774 187643 DEBUG nova.compute.manager [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.784 187643 INFO nova.virt.libvirt.driver [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Instance running successfully.
Feb 23 10:55:52 compute-0 virtqemud[186733]: argument unsupported: QEMU guest agent is not configured
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.786 187643 DEBUG nova.virt.libvirt.guest [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.787 187643 DEBUG nova.virt.libvirt.driver [None req-9ed745d5-cd48-4a2a-9bd2-258c848ec997 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.790 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.792 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.825 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] During sync_power_state the instance has a pending task (resize_finish). Skip.
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.825 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844152.780054, ce5f9655-093d-401a-8279-2affb3f9ea4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.826 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] VM Started (Lifecycle Event)
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.878 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:55:52 compute-0 nova_compute[187639]: 2026-02-23 10:55:52.881 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:55:54 compute-0 nova_compute[187639]: 2026-02-23 10:55:54.398 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:55 compute-0 nova_compute[187639]: 2026-02-23 10:55:55.431 187643 DEBUG nova.compute.manager [req-998b3e6a-7005-4e6f-a243-c79dd8bd82d6 req-b92fbd32-122f-40d1-a6c0-819846196d6d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:55 compute-0 nova_compute[187639]: 2026-02-23 10:55:55.431 187643 DEBUG oslo_concurrency.lockutils [req-998b3e6a-7005-4e6f-a243-c79dd8bd82d6 req-b92fbd32-122f-40d1-a6c0-819846196d6d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:55 compute-0 nova_compute[187639]: 2026-02-23 10:55:55.432 187643 DEBUG oslo_concurrency.lockutils [req-998b3e6a-7005-4e6f-a243-c79dd8bd82d6 req-b92fbd32-122f-40d1-a6c0-819846196d6d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:55 compute-0 nova_compute[187639]: 2026-02-23 10:55:55.432 187643 DEBUG oslo_concurrency.lockutils [req-998b3e6a-7005-4e6f-a243-c79dd8bd82d6 req-b92fbd32-122f-40d1-a6c0-819846196d6d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:55 compute-0 nova_compute[187639]: 2026-02-23 10:55:55.433 187643 DEBUG nova.compute.manager [req-998b3e6a-7005-4e6f-a243-c79dd8bd82d6 req-b92fbd32-122f-40d1-a6c0-819846196d6d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] No waiting events found dispatching network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:55:55 compute-0 nova_compute[187639]: 2026-02-23 10:55:55.433 187643 WARNING nova.compute.manager [req-998b3e6a-7005-4e6f-a243-c79dd8bd82d6 req-b92fbd32-122f-40d1-a6c0-819846196d6d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received unexpected event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for instance with vm_state resized and task_state None.
Feb 23 10:55:55 compute-0 sshd-session[208775]: Connection closed by authenticating user root 143.198.30.3 port 43798 [preauth]
Feb 23 10:55:56 compute-0 sshd-session[208777]: Connection closed by authenticating user root 165.227.79.48 port 53350 [preauth]
Feb 23 10:55:57 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 10:55:57 compute-0 systemd[208661]: Activating special unit Exit the Session...
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped target Main User Target.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped target Basic System.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped target Paths.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped target Sockets.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped target Timers.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 10:55:57 compute-0 systemd[208661]: Closed D-Bus User Message Bus Socket.
Feb 23 10:55:57 compute-0 systemd[208661]: Stopped Create User's Volatile Files and Directories.
Feb 23 10:55:57 compute-0 systemd[208661]: Removed slice User Application Slice.
Feb 23 10:55:57 compute-0 systemd[208661]: Reached target Shutdown.
Feb 23 10:55:57 compute-0 systemd[208661]: Finished Exit the Session.
Feb 23 10:55:57 compute-0 systemd[208661]: Reached target Exit the Session.
Feb 23 10:55:57 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 10:55:57 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 10:55:57 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 10:55:57 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 10:55:57 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 10:55:57 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 10:55:57 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.338 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.350 187643 DEBUG nova.network.neutron [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updated VIF entry in instance network info cache for port 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.351 187643 DEBUG nova.network.neutron [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating instance_info_cache with network_info: [{"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.386 187643 DEBUG oslo_concurrency.lockutils [req-b9fc42dd-b94d-4020-8477-a01ebded7623 req-cfdc4af2-7cc5-4d5a-bc53-f5a525241a0c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.387 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.388 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.388 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.598 187643 DEBUG nova.compute.manager [req-4217d4c8-69af-4968-8a39-2221c50364ae req-91b538c7-7166-49b2-9c9a-7f6d593e73fa 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.599 187643 DEBUG oslo_concurrency.lockutils [req-4217d4c8-69af-4968-8a39-2221c50364ae req-91b538c7-7166-49b2-9c9a-7f6d593e73fa 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.599 187643 DEBUG oslo_concurrency.lockutils [req-4217d4c8-69af-4968-8a39-2221c50364ae req-91b538c7-7166-49b2-9c9a-7f6d593e73fa 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.599 187643 DEBUG oslo_concurrency.lockutils [req-4217d4c8-69af-4968-8a39-2221c50364ae req-91b538c7-7166-49b2-9c9a-7f6d593e73fa 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.600 187643 DEBUG nova.compute.manager [req-4217d4c8-69af-4968-8a39-2221c50364ae req-91b538c7-7166-49b2-9c9a-7f6d593e73fa 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] No waiting events found dispatching network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:55:57 compute-0 nova_compute[187639]: 2026-02-23 10:55:57.600 187643 WARNING nova.compute.manager [req-4217d4c8-69af-4968-8a39-2221c50364ae req-91b538c7-7166-49b2-9c9a-7f6d593e73fa 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received unexpected event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for instance with vm_state resized and task_state None.
Feb 23 10:55:58 compute-0 podman[208781]: 2026-02-23 10:55:58.849740387 +0000 UTC m=+0.046789937 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.120 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating instance_info_cache with network_info: [{"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.228 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-ce5f9655-093d-401a-8279-2affb3f9ea4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.228 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.229 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.229 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.229 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.249 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.250 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.250 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.250 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.338 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.380 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.381 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.447 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.449 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.452 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.527 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.528 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.584 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.712 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.713 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5544MB free_disk=73.1478385925293GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.713 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.713 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:55:59 compute-0 podman[197002]: time="2026-02-23T10:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:55:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 10:55:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2626 "" "Go-http-client/1.1"
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.778 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Applying migration context for instance ce5f9655-093d-401a-8279-2affb3f9ea4c as it has an incoming, in-progress migration 64a32f9f-ca71-48f1-89be-b239b8fa0c57. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.779 187643 INFO nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating resource usage from migration 64a32f9f-ca71-48f1-89be-b239b8fa0c57
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.808 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance ff2f3092-5677-4653-b466-6507edb18e01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.809 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance ce5f9655-093d-401a-8279-2affb3f9ea4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.809 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.809 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:55:59 compute-0 nova_compute[187639]: 2026-02-23 10:55:59.881 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:56:00 compute-0 nova_compute[187639]: 2026-02-23 10:56:00.039 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:56:00 compute-0 nova_compute[187639]: 2026-02-23 10:56:00.070 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:56:00 compute-0 nova_compute[187639]: 2026-02-23 10:56:00.071 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:01 compute-0 openstack_network_exporter[199919]: ERROR   10:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:56:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:56:01 compute-0 openstack_network_exporter[199919]: ERROR   10:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:56:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:56:02 compute-0 nova_compute[187639]: 2026-02-23 10:56:02.343 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:03 compute-0 nova_compute[187639]: 2026-02-23 10:56:03.066 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:04 compute-0 nova_compute[187639]: 2026-02-23 10:56:04.489 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:04 compute-0 podman[208822]: 2026-02-23 10:56:04.881407319 +0000 UTC m=+0.082404255 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.name=CentOS 
Stream 9 Base Image, container_name=ovn_controller)
Feb 23 10:56:05 compute-0 ovn_controller[97601]: 2026-02-23T10:56:05Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:97:04 10.100.0.10
Feb 23 10:56:07 compute-0 nova_compute[187639]: 2026-02-23 10:56:07.345 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:08 compute-0 podman[208848]: 2026-02-23 10:56:08.843334474 +0000 UTC m=+0.051241125 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 10:56:09 compute-0 nova_compute[187639]: 2026-02-23 10:56:09.492 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:12 compute-0 nova_compute[187639]: 2026-02-23 10:56:12.346 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:12.638 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:12.639 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:12.640 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:14 compute-0 nova_compute[187639]: 2026-02-23 10:56:14.495 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:17 compute-0 nova_compute[187639]: 2026-02-23 10:56:17.348 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:19 compute-0 nova_compute[187639]: 2026-02-23 10:56:19.497 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.017 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.018 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.041 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.144 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.145 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.154 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.155 187643 INFO nova.compute.claims [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Claim successful on node compute-0.ctlplane.example.com
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.329 187643 DEBUG nova.compute.provider_tree [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.346 187643 DEBUG nova.scheduler.client.report [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.373 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.373 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.443 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.444 187643 DEBUG nova.network.neutron [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.559 187643 INFO nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.585 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.695 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.697 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.698 187643 INFO nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Creating image(s)
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.699 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "/var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.699 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "/var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.701 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "/var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.727 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.807 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.808 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.808 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.819 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.874 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.874 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.902 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.903 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.903 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.958 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.958 187643 DEBUG nova.virt.disk.api [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Checking if we can resize image /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 10:56:20 compute-0 nova_compute[187639]: 2026-02-23 10:56:20.959 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.012 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.013 187643 DEBUG nova.virt.disk.api [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Cannot resize image /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.013 187643 DEBUG nova.objects.instance [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'migration_context' on Instance uuid a9b337ca-0b37-41f6-bc6c-df32d188e518 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.026 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.026 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Ensure instance console log exists: /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.027 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.027 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.028 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:21 compute-0 nova_compute[187639]: 2026-02-23 10:56:21.139 187643 DEBUG nova.policy [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '38b4598a0d9649aaa7ba0cfac82e4414', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 10:56:21 compute-0 podman[208887]: 2026-02-23 10:56:21.851927717 +0000 UTC m=+0.049887012 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:56:22 compute-0 nova_compute[187639]: 2026-02-23 10:56:22.349 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:23 compute-0 nova_compute[187639]: 2026-02-23 10:56:23.274 187643 DEBUG nova.network.neutron [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Successfully created port: 558b51e6-cccf-4284-b026-65ada3d4aaa3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.539 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.845 187643 DEBUG nova.network.neutron [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Successfully updated port: 558b51e6-cccf-4284-b026-65ada3d4aaa3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.867 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "refresh_cache-a9b337ca-0b37-41f6-bc6c-df32d188e518" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.867 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquired lock "refresh_cache-a9b337ca-0b37-41f6-bc6c-df32d188e518" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.867 187643 DEBUG nova.network.neutron [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.946 187643 DEBUG nova.compute.manager [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-changed-558b51e6-cccf-4284-b026-65ada3d4aaa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.946 187643 DEBUG nova.compute.manager [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Refreshing instance network info cache due to event network-changed-558b51e6-cccf-4284-b026-65ada3d4aaa3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 10:56:24 compute-0 nova_compute[187639]: 2026-02-23 10:56:24.947 187643 DEBUG oslo_concurrency.lockutils [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-a9b337ca-0b37-41f6-bc6c-df32d188e518" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:56:25 compute-0 nova_compute[187639]: 2026-02-23 10:56:25.070 187643 DEBUG nova.network.neutron [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.288 187643 DEBUG nova.network.neutron [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Updating instance_info_cache with network_info: [{"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.314 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Releasing lock "refresh_cache-a9b337ca-0b37-41f6-bc6c-df32d188e518" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.314 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Instance network_info: |[{"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.314 187643 DEBUG oslo_concurrency.lockutils [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-a9b337ca-0b37-41f6-bc6c-df32d188e518" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.315 187643 DEBUG nova.network.neutron [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Refreshing network info cache for port 558b51e6-cccf-4284-b026-65ada3d4aaa3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.317 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Start _get_guest_xml network_info=[{"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.323 187643 WARNING nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.335 187643 DEBUG nova.virt.libvirt.host [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.337 187643 DEBUG nova.virt.libvirt.host [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.343 187643 DEBUG nova.virt.libvirt.host [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.344 187643 DEBUG nova.virt.libvirt.host [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.345 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.346 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.347 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.347 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.348 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.348 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.349 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.349 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.350 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.350 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.350 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.351 187643 DEBUG nova.virt.hardware [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.356 187643 DEBUG nova.virt.libvirt.vif [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T10:56:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1976330993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1976330993',id=4,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-cer9rjz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:56:20Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=a9b337ca-0b37-41f6-bc6c-df32d188e518,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.357 187643 DEBUG nova.network.os_vif_util [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.358 187643 DEBUG nova.network.os_vif_util [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.359 187643 DEBUG nova.objects.instance [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'pci_devices' on Instance uuid a9b337ca-0b37-41f6-bc6c-df32d188e518 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.372 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] End _get_guest_xml xml=<domain type="kvm">
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <uuid>a9b337ca-0b37-41f6-bc6c-df32d188e518</uuid>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <name>instance-00000004</name>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <metadata>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1976330993</nova:name>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 10:56:26</nova:creationTime>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:user uuid="38b4598a0d9649aaa7ba0cfac82e4414">tempest-TestExecuteActionsViaActuator-1766821287-project-member</nova:user>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:project uuid="8b2fdb094fae4998b67f82aa76acda6a">tempest-TestExecuteActionsViaActuator-1766821287</nova:project>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         <nova:port uuid="558b51e6-cccf-4284-b026-65ada3d4aaa3">
Feb 23 10:56:26 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </metadata>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <system>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <entry name="serial">a9b337ca-0b37-41f6-bc6c-df32d188e518</entry>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <entry name="uuid">a9b337ca-0b37-41f6-bc6c-df32d188e518</entry>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </system>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <os>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </os>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <features>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <apic/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </features>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </clock>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.config"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:02:8d:71"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <target dev="tap558b51e6-cc"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/console.log" append="off"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </serial>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <video>
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </video>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 10:56:26 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 10:56:26 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 10:56:26 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:56:26 compute-0 nova_compute[187639]: </domain>
Feb 23 10:56:26 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.374 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Preparing to wait for external event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.374 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.375 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.375 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.376 187643 DEBUG nova.virt.libvirt.vif [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T10:56:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1976330993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1976330993',id=4,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-cer9rjz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766
821287-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:56:20Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=a9b337ca-0b37-41f6-bc6c-df32d188e518,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.376 187643 DEBUG nova.network.os_vif_util [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.377 187643 DEBUG nova.network.os_vif_util [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.377 187643 DEBUG os_vif [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.377 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.378 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.378 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.381 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.381 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap558b51e6-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.381 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap558b51e6-cc, col_values=(('external_ids', {'iface-id': '558b51e6-cccf-4284-b026-65ada3d4aaa3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:8d:71', 'vm-uuid': 'a9b337ca-0b37-41f6-bc6c-df32d188e518'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.383 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:26 compute-0 NetworkManager[57207]: <info>  [1771844186.3839] manager: (tap558b51e6-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.385 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.390 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.391 187643 INFO os_vif [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc')
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.444 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.444 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.445 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] No VIF found with MAC fa:16:3e:02:8d:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 10:56:26 compute-0 nova_compute[187639]: 2026-02-23 10:56:26.445 187643 INFO nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Using config drive
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.325 187643 INFO nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Creating config drive at /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.config
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.330 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp37x7o8f7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.448 187643 DEBUG oslo_concurrency.processutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp37x7o8f7" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:27 compute-0 kernel: tap558b51e6-cc: entered promiscuous mode
Feb 23 10:56:27 compute-0 NetworkManager[57207]: <info>  [1771844187.4976] manager: (tap558b51e6-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Feb 23 10:56:27 compute-0 ovn_controller[97601]: 2026-02-23T10:56:27Z|00036|binding|INFO|Claiming lport 558b51e6-cccf-4284-b026-65ada3d4aaa3 for this chassis.
Feb 23 10:56:27 compute-0 ovn_controller[97601]: 2026-02-23T10:56:27Z|00037|binding|INFO|558b51e6-cccf-4284-b026-65ada3d4aaa3: Claiming fa:16:3e:02:8d:71 10.100.0.14
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.499 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.511 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:8d:71 10.100.0.14'], port_security=['fa:16:3e:02:8d:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a9b337ca-0b37-41f6-bc6c-df32d188e518', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=558b51e6-cccf-4284-b026-65ada3d4aaa3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.511 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:27 compute-0 ovn_controller[97601]: 2026-02-23T10:56:27Z|00038|binding|INFO|Setting lport 558b51e6-cccf-4284-b026-65ada3d4aaa3 ovn-installed in OVS
Feb 23 10:56:27 compute-0 ovn_controller[97601]: 2026-02-23T10:56:27Z|00039|binding|INFO|Setting lport 558b51e6-cccf-4284-b026-65ada3d4aaa3 up in Southbound
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.514 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 558b51e6-cccf-4284-b026-65ada3d4aaa3 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a bound to our chassis
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.514 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.517 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:56:27 compute-0 systemd-machined[156970]: New machine qemu-3-instance-00000004.
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.528 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ac81febb-66ff-4428-bb37-71eb3df61533]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.547 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[293f92d9-586a-4f19-a4c2-c9dc9da97013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.550 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[cda3ea42-0cb6-4c73-a3c3-1370a190a90d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:56:27 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Feb 23 10:56:27 compute-0 systemd-udevd[208936]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.569 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[52e3b795-05d4-4708-9296-39e754bc8b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:56:27 compute-0 NetworkManager[57207]: <info>  [1771844187.5706] device (tap558b51e6-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:56:27 compute-0 NetworkManager[57207]: <info>  [1771844187.5712] device (tap558b51e6-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.581 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[700236bd-241e-44cf-82e6-6d0672f0fe88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 27640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208940, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.593 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a7105cee-cbaa-4b0d-98bf-9eaff4aa9297]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327577, 'tstamp': 327577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208942, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327579, 'tstamp': 327579}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208942, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.594 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.597 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.597 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.598 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.598 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:27.598 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.827 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844187.826314, a9b337ca-0b37-41f6-bc6c-df32d188e518 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.828 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] VM Started (Lifecycle Event)
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.864 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.869 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844187.8264558, a9b337ca-0b37-41f6-bc6c-df32d188e518 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.869 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] VM Paused (Lifecycle Event)
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.889 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.893 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:56:27 compute-0 nova_compute[187639]: 2026-02-23 10:56:27.916 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 10:56:27 compute-0 sshd-session[208956]: Connection closed by authenticating user root 143.198.30.3 port 53780 [preauth]
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.264 187643 DEBUG nova.compute.manager [req-a77d116e-4f96-4f38-b0be-b1087b28c159 req-29289278-2197-4e96-972a-f565342dc472 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.265 187643 DEBUG oslo_concurrency.lockutils [req-a77d116e-4f96-4f38-b0be-b1087b28c159 req-29289278-2197-4e96-972a-f565342dc472 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.265 187643 DEBUG oslo_concurrency.lockutils [req-a77d116e-4f96-4f38-b0be-b1087b28c159 req-29289278-2197-4e96-972a-f565342dc472 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.265 187643 DEBUG oslo_concurrency.lockutils [req-a77d116e-4f96-4f38-b0be-b1087b28c159 req-29289278-2197-4e96-972a-f565342dc472 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.265 187643 DEBUG nova.compute.manager [req-a77d116e-4f96-4f38-b0be-b1087b28c159 req-29289278-2197-4e96-972a-f565342dc472 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Processing event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.266 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.269 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844188.2694786, a9b337ca-0b37-41f6-bc6c-df32d188e518 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.269 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] VM Resumed (Lifecycle Event)
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.271 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.274 187643 INFO nova.virt.libvirt.driver [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Instance spawned successfully.
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.275 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.288 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.294 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.297 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.297 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.297 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.298 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.298 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.298 187643 DEBUG nova.virt.libvirt.driver [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.328 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.350 187643 DEBUG nova.network.neutron [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Updated VIF entry in instance network info cache for port 558b51e6-cccf-4284-b026-65ada3d4aaa3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.351 187643 DEBUG nova.network.neutron [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Updating instance_info_cache with network_info: [{"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.357 187643 INFO nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Took 7.66 seconds to spawn the instance on the hypervisor.
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.358 187643 DEBUG nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.368 187643 DEBUG oslo_concurrency.lockutils [req-4988395a-4892-4a69-b58a-1b072ecc9124 req-2574c0d0-bb7a-448a-9dbd-a138866add99 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-a9b337ca-0b37-41f6-bc6c-df32d188e518" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.422 187643 INFO nova.compute.manager [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Took 8.32 seconds to build instance.
Feb 23 10:56:28 compute-0 nova_compute[187639]: 2026-02-23 10:56:28.438 187643 DEBUG oslo_concurrency.lockutils [None req-968d2435-8e98-40d0-83f2-e9395ed27f8e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:29 compute-0 nova_compute[187639]: 2026-02-23 10:56:29.541 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:29 compute-0 podman[197002]: time="2026-02-23T10:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:56:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 10:56:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2624 "" "Go-http-client/1.1"
Feb 23 10:56:29 compute-0 podman[208967]: 2026-02-23 10:56:29.865642045 +0000 UTC m=+0.055804881 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 10:56:30 compute-0 nova_compute[187639]: 2026-02-23 10:56:30.359 187643 DEBUG nova.compute.manager [req-c0835ba6-0f8c-47ec-b7c3-f68027b05ec6 req-dda90c1d-1b72-43a4-9ca5-9cb8023cb95e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:56:30 compute-0 nova_compute[187639]: 2026-02-23 10:56:30.359 187643 DEBUG oslo_concurrency.lockutils [req-c0835ba6-0f8c-47ec-b7c3-f68027b05ec6 req-dda90c1d-1b72-43a4-9ca5-9cb8023cb95e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:30 compute-0 nova_compute[187639]: 2026-02-23 10:56:30.360 187643 DEBUG oslo_concurrency.lockutils [req-c0835ba6-0f8c-47ec-b7c3-f68027b05ec6 req-dda90c1d-1b72-43a4-9ca5-9cb8023cb95e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:30 compute-0 nova_compute[187639]: 2026-02-23 10:56:30.360 187643 DEBUG oslo_concurrency.lockutils [req-c0835ba6-0f8c-47ec-b7c3-f68027b05ec6 req-dda90c1d-1b72-43a4-9ca5-9cb8023cb95e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:30 compute-0 nova_compute[187639]: 2026-02-23 10:56:30.360 187643 DEBUG nova.compute.manager [req-c0835ba6-0f8c-47ec-b7c3-f68027b05ec6 req-dda90c1d-1b72-43a4-9ca5-9cb8023cb95e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] No waiting events found dispatching network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:56:30 compute-0 nova_compute[187639]: 2026-02-23 10:56:30.361 187643 WARNING nova.compute.manager [req-c0835ba6-0f8c-47ec-b7c3-f68027b05ec6 req-dda90c1d-1b72-43a4-9ca5-9cb8023cb95e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received unexpected event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 for instance with vm_state active and task_state None.
Feb 23 10:56:31 compute-0 nova_compute[187639]: 2026-02-23 10:56:31.384 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:31 compute-0 openstack_network_exporter[199919]: ERROR   10:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:56:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:56:31 compute-0 openstack_network_exporter[199919]: ERROR   10:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:56:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:56:34 compute-0 nova_compute[187639]: 2026-02-23 10:56:34.586 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:35 compute-0 podman[208988]: 2026-02-23 10:56:35.897041118 +0000 UTC m=+0.102685002 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 23 10:56:36 compute-0 nova_compute[187639]: 2026-02-23 10:56:36.386 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:38 compute-0 ovn_controller[97601]: 2026-02-23T10:56:38Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:8d:71 10.100.0.14
Feb 23 10:56:38 compute-0 ovn_controller[97601]: 2026-02-23T10:56:38Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:8d:71 10.100.0.14
Feb 23 10:56:39 compute-0 nova_compute[187639]: 2026-02-23 10:56:39.628 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:39 compute-0 sshd-session[209026]: Connection closed by authenticating user root 165.227.79.48 port 34718 [preauth]
Feb 23 10:56:39 compute-0 podman[209028]: 2026-02-23 10:56:39.870407057 +0000 UTC m=+0.063182550 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git)
Feb 23 10:56:40 compute-0 nova_compute[187639]: 2026-02-23 10:56:40.204 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:40.207 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:56:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:40.208 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:56:41 compute-0 nova_compute[187639]: 2026-02-23 10:56:41.389 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:44 compute-0 nova_compute[187639]: 2026-02-23 10:56:44.669 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:46 compute-0 nova_compute[187639]: 2026-02-23 10:56:46.391 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:46 compute-0 nova_compute[187639]: 2026-02-23 10:56:46.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:56:49.209 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:56:49 compute-0 nova_compute[187639]: 2026-02-23 10:56:49.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:49 compute-0 nova_compute[187639]: 2026-02-23 10:56:49.708 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:50 compute-0 nova_compute[187639]: 2026-02-23 10:56:50.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:50 compute-0 nova_compute[187639]: 2026-02-23 10:56:50.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:50 compute-0 nova_compute[187639]: 2026-02-23 10:56:50.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:50 compute-0 nova_compute[187639]: 2026-02-23 10:56:50.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:56:51 compute-0 nova_compute[187639]: 2026-02-23 10:56:51.395 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:51 compute-0 nova_compute[187639]: 2026-02-23 10:56:51.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:52 compute-0 nova_compute[187639]: 2026-02-23 10:56:52.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:52 compute-0 nova_compute[187639]: 2026-02-23 10:56:52.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:56:52 compute-0 podman[209049]: 2026-02-23 10:56:52.868290182 +0000 UTC m=+0.063986361 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:56:53 compute-0 nova_compute[187639]: 2026-02-23 10:56:53.180 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:56:53 compute-0 nova_compute[187639]: 2026-02-23 10:56:53.180 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:56:53 compute-0 nova_compute[187639]: 2026-02-23 10:56:53.181 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.746 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.760 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Updating instance_info_cache with network_info: [{"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.779 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-ff2f3092-5677-4653-b466-6507edb18e01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.779 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.780 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.780 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.807 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.807 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.807 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.807 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.882 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.957 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:54 compute-0 nova_compute[187639]: 2026-02-23 10:56:54.959 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.016 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.023 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.086 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.087 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.148 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.153 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.225 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.226 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.282 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.406 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.408 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5386MB free_disk=73.1206283569336GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.408 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.408 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.486 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance ff2f3092-5677-4653-b466-6507edb18e01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.487 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance ce5f9655-093d-401a-8279-2affb3f9ea4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.487 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance a9b337ca-0b37-41f6-bc6c-df32d188e518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.487 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.487 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.549 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.566 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.587 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:56:55 compute-0 nova_compute[187639]: 2026-02-23 10:56:55.587 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:56:56 compute-0 nova_compute[187639]: 2026-02-23 10:56:56.397 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:59 compute-0 sshd-session[209092]: Invalid user admin from 143.198.30.3 port 53288
Feb 23 10:56:59 compute-0 sshd-session[209092]: Connection closed by invalid user admin 143.198.30.3 port 53288 [preauth]
Feb 23 10:56:59 compute-0 podman[197002]: time="2026-02-23T10:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:56:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 10:56:59 compute-0 nova_compute[187639]: 2026-02-23 10:56:59.748 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:56:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2628 "" "Go-http-client/1.1"
Feb 23 10:57:00 compute-0 podman[209094]: 2026-02-23 10:57:00.870247976 +0000 UTC m=+0.065819460 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 23 10:57:01 compute-0 nova_compute[187639]: 2026-02-23 10:57:01.400 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:01 compute-0 openstack_network_exporter[199919]: ERROR   10:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:57:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:57:01 compute-0 openstack_network_exporter[199919]: ERROR   10:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:57:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:57:04 compute-0 nova_compute[187639]: 2026-02-23 10:57:04.751 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:06 compute-0 nova_compute[187639]: 2026-02-23 10:57:06.403 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:06 compute-0 podman[209126]: 2026-02-23 10:57:06.869515213 +0000 UTC m=+0.073231580 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:57:09 compute-0 nova_compute[187639]: 2026-02-23 10:57:09.753 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:10 compute-0 podman[209153]: 2026-02-23 10:57:10.84548839 +0000 UTC m=+0.051377772 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 10:57:11 compute-0 nova_compute[187639]: 2026-02-23 10:57:11.405 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:12.639 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:12.639 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:12.640 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:14 compute-0 nova_compute[187639]: 2026-02-23 10:57:14.625 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Creating tmpfile /var/lib/nova/instances/tmpmlq1l91w to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 23 10:57:14 compute-0 nova_compute[187639]: 2026-02-23 10:57:14.708 187643 DEBUG nova.compute.manager [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmlq1l91w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 23 10:57:14 compute-0 nova_compute[187639]: 2026-02-23 10:57:14.755 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:15 compute-0 nova_compute[187639]: 2026-02-23 10:57:15.648 187643 DEBUG nova.compute.manager [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmlq1l91w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='81523704-1367-466b-8876-682fba2244b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 23 10:57:15 compute-0 nova_compute[187639]: 2026-02-23 10:57:15.683 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-81523704-1367-466b-8876-682fba2244b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:57:15 compute-0 nova_compute[187639]: 2026-02-23 10:57:15.683 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-81523704-1367-466b-8876-682fba2244b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:57:15 compute-0 nova_compute[187639]: 2026-02-23 10:57:15.683 187643 DEBUG nova.network.neutron [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 10:57:16 compute-0 nova_compute[187639]: 2026-02-23 10:57:16.409 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.203 187643 DEBUG nova.network.neutron [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Updating instance_info_cache with network_info: [{"id": "718140bb-9a89-42fb-8a51-295ef9667227", "address": "fa:16:3e:03:5e:59", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap718140bb-9a", "ovs_interfaceid": "718140bb-9a89-42fb-8a51-295ef9667227", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.225 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-81523704-1367-466b-8876-682fba2244b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.226 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmlq1l91w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='81523704-1367-466b-8876-682fba2244b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.227 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Creating instance directory: /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.227 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Creating disk.info with the contents: {'/var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk': 'qcow2', '/var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.228 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.228 187643 DEBUG nova.objects.instance [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 81523704-1367-466b-8876-682fba2244b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.258 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.336 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.338 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.339 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.363 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.433 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.435 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.469 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.471 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.472 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.539 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.540 187643 DEBUG nova.virt.disk.api [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Checking if we can resize image /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.540 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.597 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.598 187643 DEBUG nova.virt.disk.api [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Cannot resize image /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.599 187643 DEBUG nova.objects.instance [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid 81523704-1367-466b-8876-682fba2244b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.617 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.632 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk.config 485376" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.634 187643 DEBUG nova.virt.libvirt.volume.remotefs [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk.config to /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.634 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk.config /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.994 187643 DEBUG oslo_concurrency.processutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9/disk.config /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9" returned: 0 in 0.360s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.995 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.997 187643 DEBUG nova.virt.libvirt.vif [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:56:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-937316806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-937316806',id=3,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:56:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-gs33w1pe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:56:14Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=81523704-1367-466b-8876-682fba2244b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "718140bb-9a89-42fb-8a51-295ef9667227", "address": "fa:16:3e:03:5e:59", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap718140bb-9a", "ovs_interfaceid": "718140bb-9a89-42fb-8a51-295ef9667227", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 10:57:17 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.998 187643 DEBUG nova.network.os_vif_util [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "718140bb-9a89-42fb-8a51-295ef9667227", "address": "fa:16:3e:03:5e:59", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap718140bb-9a", "ovs_interfaceid": "718140bb-9a89-42fb-8a51-295ef9667227", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:17.999 187643 DEBUG nova.network.os_vif_util [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:5e:59,bridge_name='br-int',has_traffic_filtering=True,id=718140bb-9a89-42fb-8a51-295ef9667227,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap718140bb-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.000 187643 DEBUG os_vif [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5e:59,bridge_name='br-int',has_traffic_filtering=True,id=718140bb-9a89-42fb-8a51-295ef9667227,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap718140bb-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.001 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.002 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.003 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.006 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.007 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap718140bb-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.008 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap718140bb-9a, col_values=(('external_ids', {'iface-id': '718140bb-9a89-42fb-8a51-295ef9667227', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:5e:59', 'vm-uuid': '81523704-1367-466b-8876-682fba2244b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.010 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:18 compute-0 NetworkManager[57207]: <info>  [1771844238.0111] manager: (tap718140bb-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.013 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.015 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.016 187643 INFO os_vif [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5e:59,bridge_name='br-int',has_traffic_filtering=True,id=718140bb-9a89-42fb-8a51-295ef9667227,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap718140bb-9a')
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.017 187643 DEBUG nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 23 10:57:18 compute-0 nova_compute[187639]: 2026-02-23 10:57:18.018 187643 DEBUG nova.compute.manager [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmlq1l91w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='81523704-1367-466b-8876-682fba2244b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 23 10:57:19 compute-0 nova_compute[187639]: 2026-02-23 10:57:19.757 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:22 compute-0 nova_compute[187639]: 2026-02-23 10:57:22.190 187643 DEBUG nova.network.neutron [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Port 718140bb-9a89-42fb-8a51-295ef9667227 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 23 10:57:22 compute-0 nova_compute[187639]: 2026-02-23 10:57:22.192 187643 DEBUG nova.compute.manager [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmlq1l91w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='81523704-1367-466b-8876-682fba2244b9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 23 10:57:22 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 23 10:57:22 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 23 10:57:22 compute-0 kernel: tap718140bb-9a: entered promiscuous mode
Feb 23 10:57:22 compute-0 NetworkManager[57207]: <info>  [1771844242.5115] manager: (tap718140bb-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Feb 23 10:57:22 compute-0 ovn_controller[97601]: 2026-02-23T10:57:22Z|00040|binding|INFO|Claiming lport 718140bb-9a89-42fb-8a51-295ef9667227 for this additional chassis.
Feb 23 10:57:22 compute-0 ovn_controller[97601]: 2026-02-23T10:57:22Z|00041|binding|INFO|718140bb-9a89-42fb-8a51-295ef9667227: Claiming fa:16:3e:03:5e:59 10.100.0.11
Feb 23 10:57:22 compute-0 nova_compute[187639]: 2026-02-23 10:57:22.514 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:22 compute-0 ovn_controller[97601]: 2026-02-23T10:57:22Z|00042|binding|INFO|Setting lport 718140bb-9a89-42fb-8a51-295ef9667227 ovn-installed in OVS
Feb 23 10:57:22 compute-0 nova_compute[187639]: 2026-02-23 10:57:22.524 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:22 compute-0 nova_compute[187639]: 2026-02-23 10:57:22.527 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:22 compute-0 systemd-udevd[209231]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:57:22 compute-0 systemd-machined[156970]: New machine qemu-4-instance-00000003.
Feb 23 10:57:22 compute-0 NetworkManager[57207]: <info>  [1771844242.5567] device (tap718140bb-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:57:22 compute-0 NetworkManager[57207]: <info>  [1771844242.5572] device (tap718140bb-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 10:57:22 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000003.
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.011 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.140 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844243.1397257, 81523704-1367-466b-8876-682fba2244b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.140 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] VM Started (Lifecycle Event)
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.166 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:57:23 compute-0 sshd-session[209260]: Connection closed by authenticating user root 165.227.79.48 port 52936 [preauth]
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.812 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844243.8122785, 81523704-1367-466b-8876-682fba2244b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.813 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] VM Resumed (Lifecycle Event)
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.834 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.838 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:57:23 compute-0 nova_compute[187639]: 2026-02-23 10:57:23.865 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 23 10:57:23 compute-0 podman[209262]: 2026-02-23 10:57:23.876405613 +0000 UTC m=+0.074331980 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:57:24 compute-0 nova_compute[187639]: 2026-02-23 10:57:24.790 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:25 compute-0 ovn_controller[97601]: 2026-02-23T10:57:25Z|00043|binding|INFO|Claiming lport 718140bb-9a89-42fb-8a51-295ef9667227 for this chassis.
Feb 23 10:57:25 compute-0 ovn_controller[97601]: 2026-02-23T10:57:25Z|00044|binding|INFO|718140bb-9a89-42fb-8a51-295ef9667227: Claiming fa:16:3e:03:5e:59 10.100.0.11
Feb 23 10:57:25 compute-0 ovn_controller[97601]: 2026-02-23T10:57:25Z|00045|binding|INFO|Setting lport 718140bb-9a89-42fb-8a51-295ef9667227 up in Southbound
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.216 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5e:59 10.100.0.11'], port_security=['fa:16:3e:03:5e:59 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '81523704-1367-466b-8876-682fba2244b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=718140bb-9a89-42fb-8a51-295ef9667227) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.217 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 718140bb-9a89-42fb-8a51-295ef9667227 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a bound to our chassis
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.219 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.227 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c29949-e07d-40e8-ac35-5135fd44de45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.243 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf1e857-959d-4727-8281-4c88b4fac5ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.246 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[6fde07bf-19c2-423c-a326-aacf859bc2b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.271 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5519d1-bd1c-4de9-9501-f8ddffdc8d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.289 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[188c762c-ba0f-47f6-9943-aa602bd8e415]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 27640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209291, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.300 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b5019d27-7379-4a29-9f1c-cbce4337620e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327577, 'tstamp': 327577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209292, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327579, 'tstamp': 327579}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209292, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.302 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:25 compute-0 nova_compute[187639]: 2026-02-23 10:57:25.304 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.307 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.307 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.308 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:25.308 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:25 compute-0 nova_compute[187639]: 2026-02-23 10:57:25.356 187643 INFO nova.compute.manager [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Post operation of migration started
Feb 23 10:57:25 compute-0 nova_compute[187639]: 2026-02-23 10:57:25.662 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-81523704-1367-466b-8876-682fba2244b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:57:25 compute-0 nova_compute[187639]: 2026-02-23 10:57:25.662 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-81523704-1367-466b-8876-682fba2244b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:57:25 compute-0 nova_compute[187639]: 2026-02-23 10:57:25.662 187643 DEBUG nova.network.neutron [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 10:57:27 compute-0 nova_compute[187639]: 2026-02-23 10:57:27.226 187643 DEBUG nova.network.neutron [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Updating instance_info_cache with network_info: [{"id": "718140bb-9a89-42fb-8a51-295ef9667227", "address": "fa:16:3e:03:5e:59", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap718140bb-9a", "ovs_interfaceid": "718140bb-9a89-42fb-8a51-295ef9667227", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:57:27 compute-0 nova_compute[187639]: 2026-02-23 10:57:27.262 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-81523704-1367-466b-8876-682fba2244b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:57:27 compute-0 nova_compute[187639]: 2026-02-23 10:57:27.279 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:27 compute-0 nova_compute[187639]: 2026-02-23 10:57:27.280 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:27 compute-0 nova_compute[187639]: 2026-02-23 10:57:27.280 187643 DEBUG oslo_concurrency.lockutils [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:27 compute-0 nova_compute[187639]: 2026-02-23 10:57:27.286 187643 INFO nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 23 10:57:27 compute-0 virtqemud[186733]: Domain id=4 name='instance-00000003' uuid=81523704-1367-466b-8876-682fba2244b9 is tainted: custom-monitor
Feb 23 10:57:28 compute-0 nova_compute[187639]: 2026-02-23 10:57:28.013 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:28 compute-0 nova_compute[187639]: 2026-02-23 10:57:28.294 187643 INFO nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 23 10:57:29 compute-0 nova_compute[187639]: 2026-02-23 10:57:29.300 187643 INFO nova.virt.libvirt.driver [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 23 10:57:29 compute-0 nova_compute[187639]: 2026-02-23 10:57:29.304 187643 DEBUG nova.compute.manager [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:57:29 compute-0 nova_compute[187639]: 2026-02-23 10:57:29.323 187643 DEBUG nova.objects.instance [None req-e4111172-d960-4583-9860-419bdff310b3 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 23 10:57:29 compute-0 podman[197002]: time="2026-02-23T10:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:57:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 10:57:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Feb 23 10:57:29 compute-0 nova_compute[187639]: 2026-02-23 10:57:29.792 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:31 compute-0 openstack_network_exporter[199919]: ERROR   10:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:57:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:57:31 compute-0 openstack_network_exporter[199919]: ERROR   10:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:57:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:57:31 compute-0 podman[209294]: 2026-02-23 10:57:31.856501159 +0000 UTC m=+0.053143980 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:57:32 compute-0 sshd-session[209317]: Invalid user admin from 143.198.30.3 port 55914
Feb 23 10:57:32 compute-0 sshd-session[209317]: Connection closed by invalid user admin 143.198.30.3 port 55914 [preauth]
Feb 23 10:57:33 compute-0 nova_compute[187639]: 2026-02-23 10:57:33.015 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:34 compute-0 nova_compute[187639]: 2026-02-23 10:57:34.832 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:37 compute-0 podman[209319]: 2026-02-23 10:57:37.856363354 +0000 UTC m=+0.062992945 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.926 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.927 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.927 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.928 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.928 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.929 187643 INFO nova.compute.manager [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Terminating instance
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.930 187643 DEBUG nova.compute.manager [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 10:57:37 compute-0 kernel: tap558b51e6-cc (unregistering): left promiscuous mode
Feb 23 10:57:37 compute-0 NetworkManager[57207]: <info>  [1771844257.9536] device (tap558b51e6-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 10:57:37 compute-0 ovn_controller[97601]: 2026-02-23T10:57:37Z|00046|binding|INFO|Releasing lport 558b51e6-cccf-4284-b026-65ada3d4aaa3 from this chassis (sb_readonly=0)
Feb 23 10:57:37 compute-0 ovn_controller[97601]: 2026-02-23T10:57:37Z|00047|binding|INFO|Setting lport 558b51e6-cccf-4284-b026-65ada3d4aaa3 down in Southbound
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.955 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:37 compute-0 ovn_controller[97601]: 2026-02-23T10:57:37Z|00048|binding|INFO|Removing iface tap558b51e6-cc ovn-installed in OVS
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.958 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:37 compute-0 nova_compute[187639]: 2026-02-23 10:57:37.966 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:37.963 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:8d:71 10.100.0.14'], port_security=['fa:16:3e:02:8d:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a9b337ca-0b37-41f6-bc6c-df32d188e518', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=558b51e6-cccf-4284-b026-65ada3d4aaa3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:57:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:37.964 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 558b51e6-cccf-4284-b026-65ada3d4aaa3 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a unbound from our chassis
Feb 23 10:57:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:37.966 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:57:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:37.976 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[65c23b06-bf79-4c9d-bd58-ec786ef2d3d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:37.996 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[2399e105-2d72-420e-892d-a00b4f1859c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:37.998 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[1889fa56-5a5e-4b81-8354-e9a95df6dacb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:38 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 23 10:57:38 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 13.142s CPU time.
Feb 23 10:57:38 compute-0 systemd-machined[156970]: Machine qemu-3-instance-00000004 terminated.
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.014 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[696bc347-5059-4861-a572-edf09408c129]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.016 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.025 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5f841af2-22e7-4e40-aa74-9d04bc805ad2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 27640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209359, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.033 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e365e8ed-d0e5-4b15-b105-434bded27841]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327577, 'tstamp': 327577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209360, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327579, 'tstamp': 327579}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209360, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.035 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.036 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.039 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.040 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.040 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.041 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:38.041 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.176 187643 INFO nova.virt.libvirt.driver [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Instance destroyed successfully.
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.177 187643 DEBUG nova.objects.instance [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'resources' on Instance uuid a9b337ca-0b37-41f6-bc6c-df32d188e518 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.196 187643 DEBUG nova.virt.libvirt.vif [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:56:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1976330993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1976330993',id=4,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:56:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-cer9rjz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T10:56:28Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=a9b337ca-0b37-41f6-bc6c-df32d188e518,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.197 187643 DEBUG nova.network.os_vif_util [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "address": "fa:16:3e:02:8d:71", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap558b51e6-cc", "ovs_interfaceid": "558b51e6-cccf-4284-b026-65ada3d4aaa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.197 187643 DEBUG nova.network.os_vif_util [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.197 187643 DEBUG os_vif [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.199 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.199 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap558b51e6-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.200 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.201 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.203 187643 INFO os_vif [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:8d:71,bridge_name='br-int',has_traffic_filtering=True,id=558b51e6-cccf-4284-b026-65ada3d4aaa3,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558b51e6-cc')
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.203 187643 INFO nova.virt.libvirt.driver [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Deleting instance files /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518_del
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.203 187643 INFO nova.virt.libvirt.driver [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Deletion of /var/lib/nova/instances/a9b337ca-0b37-41f6-bc6c-df32d188e518_del complete
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.275 187643 DEBUG nova.virt.libvirt.host [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.276 187643 INFO nova.virt.libvirt.host [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] UEFI support detected
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.278 187643 INFO nova.compute.manager [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.278 187643 DEBUG oslo.service.loopingcall [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.279 187643 DEBUG nova.compute.manager [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.279 187643 DEBUG nova.network.neutron [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.497 187643 DEBUG nova.compute.manager [req-25d1f81c-cfbf-419c-9448-2d94f2758353 req-71bea4df-34ae-4d21-8193-c9bb86857764 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-vif-unplugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.499 187643 DEBUG oslo_concurrency.lockutils [req-25d1f81c-cfbf-419c-9448-2d94f2758353 req-71bea4df-34ae-4d21-8193-c9bb86857764 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.499 187643 DEBUG oslo_concurrency.lockutils [req-25d1f81c-cfbf-419c-9448-2d94f2758353 req-71bea4df-34ae-4d21-8193-c9bb86857764 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.499 187643 DEBUG oslo_concurrency.lockutils [req-25d1f81c-cfbf-419c-9448-2d94f2758353 req-71bea4df-34ae-4d21-8193-c9bb86857764 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.499 187643 DEBUG nova.compute.manager [req-25d1f81c-cfbf-419c-9448-2d94f2758353 req-71bea4df-34ae-4d21-8193-c9bb86857764 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] No waiting events found dispatching network-vif-unplugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.500 187643 DEBUG nova.compute.manager [req-25d1f81c-cfbf-419c-9448-2d94f2758353 req-71bea4df-34ae-4d21-8193-c9bb86857764 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-vif-unplugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.954 187643 DEBUG nova.network.neutron [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:57:38 compute-0 nova_compute[187639]: 2026-02-23 10:57:38.976 187643 INFO nova.compute.manager [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Took 0.70 seconds to deallocate network for instance.
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.039 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.040 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.152 187643 DEBUG nova.compute.provider_tree [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.172 187643 DEBUG nova.scheduler.client.report [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.212 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.249 187643 INFO nova.scheduler.client.report [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Deleted allocations for instance a9b337ca-0b37-41f6-bc6c-df32d188e518
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.311 187643 DEBUG oslo_concurrency.lockutils [None req-6d221bbf-6313-4550-a55e-288e24e4ed5c 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.812 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "81523704-1367-466b-8876-682fba2244b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.813 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.813 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "81523704-1367-466b-8876-682fba2244b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.813 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.813 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.814 187643 INFO nova.compute.manager [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Terminating instance
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.815 187643 DEBUG nova.compute.manager [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.835 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:39 compute-0 kernel: tap718140bb-9a (unregistering): left promiscuous mode
Feb 23 10:57:39 compute-0 NetworkManager[57207]: <info>  [1771844259.8421] device (tap718140bb-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 10:57:39 compute-0 ovn_controller[97601]: 2026-02-23T10:57:39Z|00049|binding|INFO|Releasing lport 718140bb-9a89-42fb-8a51-295ef9667227 from this chassis (sb_readonly=0)
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.848 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:39 compute-0 ovn_controller[97601]: 2026-02-23T10:57:39Z|00050|binding|INFO|Setting lport 718140bb-9a89-42fb-8a51-295ef9667227 down in Southbound
Feb 23 10:57:39 compute-0 ovn_controller[97601]: 2026-02-23T10:57:39Z|00051|binding|INFO|Removing iface tap718140bb-9a ovn-installed in OVS
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.850 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.853 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.855 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5e:59 10.100.0.11'], port_security=['fa:16:3e:03:5e:59 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '81523704-1367-466b-8876-682fba2244b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=718140bb-9a89-42fb-8a51-295ef9667227) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.859 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 718140bb-9a89-42fb-8a51-295ef9667227 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a unbound from our chassis
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.862 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.873 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc91523-875e-483e-8b7d-521c26f2e3f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:39 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 23 10:57:39 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 1.540s CPU time.
Feb 23 10:57:39 compute-0 systemd-machined[156970]: Machine qemu-4-instance-00000003 terminated.
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.894 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[77d0ed83-c1c9-4d38-aef5-2ecb959703bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.897 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b7770b5f-63d7-4c44-b023-ce9fbc7763a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.919 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[859d375f-f3d0-4dd7-b857-150e1a7072a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.934 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe8bd8d-26ad-4219-8770-10f50d404b2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 27640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209388, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.950 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec364d6-8c9c-4cfe-98da-2e5c5cd3c9bc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327577, 'tstamp': 327577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209389, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327579, 'tstamp': 327579}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209389, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.951 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.953 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:39 compute-0 nova_compute[187639]: 2026-02-23 10:57:39.957 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.957 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.957 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.958 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:39.958 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.064 187643 INFO nova.virt.libvirt.driver [-] [instance: 81523704-1367-466b-8876-682fba2244b9] Instance destroyed successfully.
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.065 187643 DEBUG nova.objects.instance [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'resources' on Instance uuid 81523704-1367-466b-8876-682fba2244b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.079 187643 DEBUG nova.virt.libvirt.vif [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-23T10:56:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-937316806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-937316806',id=3,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:56:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-gs33w1pe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T10:57:29Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=81523704-1367-466b-8876-682fba2244b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "718140bb-9a89-42fb-8a51-295ef9667227", "address": "fa:16:3e:03:5e:59", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap718140bb-9a", "ovs_interfaceid": "718140bb-9a89-42fb-8a51-295ef9667227", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.079 187643 DEBUG nova.network.os_vif_util [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "718140bb-9a89-42fb-8a51-295ef9667227", "address": "fa:16:3e:03:5e:59", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap718140bb-9a", "ovs_interfaceid": "718140bb-9a89-42fb-8a51-295ef9667227", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.080 187643 DEBUG nova.network.os_vif_util [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:5e:59,bridge_name='br-int',has_traffic_filtering=True,id=718140bb-9a89-42fb-8a51-295ef9667227,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap718140bb-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.081 187643 DEBUG os_vif [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5e:59,bridge_name='br-int',has_traffic_filtering=True,id=718140bb-9a89-42fb-8a51-295ef9667227,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap718140bb-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.083 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.083 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap718140bb-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.087 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.089 187643 INFO os_vif [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5e:59,bridge_name='br-int',has_traffic_filtering=True,id=718140bb-9a89-42fb-8a51-295ef9667227,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap718140bb-9a')
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.090 187643 INFO nova.virt.libvirt.driver [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Deleting instance files /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9_del
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.091 187643 INFO nova.virt.libvirt.driver [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Deletion of /var/lib/nova/instances/81523704-1367-466b-8876-682fba2244b9_del complete
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.134 187643 INFO nova.compute.manager [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Took 0.32 seconds to destroy the instance on the hypervisor.
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.135 187643 DEBUG oslo.service.loopingcall [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.135 187643 DEBUG nova.compute.manager [-] [instance: 81523704-1367-466b-8876-682fba2244b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.135 187643 DEBUG nova.network.neutron [-] [instance: 81523704-1367-466b-8876-682fba2244b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.596 187643 DEBUG nova.compute.manager [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.597 187643 DEBUG oslo_concurrency.lockutils [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.597 187643 DEBUG oslo_concurrency.lockutils [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.597 187643 DEBUG oslo_concurrency.lockutils [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a9b337ca-0b37-41f6-bc6c-df32d188e518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.597 187643 DEBUG nova.compute.manager [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] No waiting events found dispatching network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.598 187643 WARNING nova.compute.manager [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received unexpected event network-vif-plugged-558b51e6-cccf-4284-b026-65ada3d4aaa3 for instance with vm_state deleted and task_state None.
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.598 187643 DEBUG nova.compute.manager [req-ca1f9901-3677-4c9c-bddd-49fc3f2cb887 req-b2ae9bca-2e4d-48a2-a5dc-244ebae556e1 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Received event network-vif-deleted-558b51e6-cccf-4284-b026-65ada3d4aaa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.711 187643 DEBUG nova.compute.manager [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Received event network-vif-unplugged-718140bb-9a89-42fb-8a51-295ef9667227 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.712 187643 DEBUG oslo_concurrency.lockutils [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "81523704-1367-466b-8876-682fba2244b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.712 187643 DEBUG oslo_concurrency.lockutils [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.712 187643 DEBUG oslo_concurrency.lockutils [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.713 187643 DEBUG nova.compute.manager [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] No waiting events found dispatching network-vif-unplugged-718140bb-9a89-42fb-8a51-295ef9667227 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.713 187643 DEBUG nova.compute.manager [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Received event network-vif-unplugged-718140bb-9a89-42fb-8a51-295ef9667227 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.713 187643 DEBUG nova.compute.manager [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Received event network-vif-plugged-718140bb-9a89-42fb-8a51-295ef9667227 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.714 187643 DEBUG oslo_concurrency.lockutils [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "81523704-1367-466b-8876-682fba2244b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.714 187643 DEBUG oslo_concurrency.lockutils [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.714 187643 DEBUG oslo_concurrency.lockutils [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.715 187643 DEBUG nova.compute.manager [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] No waiting events found dispatching network-vif-plugged-718140bb-9a89-42fb-8a51-295ef9667227 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.715 187643 WARNING nova.compute.manager [req-c0969236-ec08-4758-9c5d-363f38590ee9 req-a6edaf67-675f-4b0c-89d9-546da1ef3795 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Received unexpected event network-vif-plugged-718140bb-9a89-42fb-8a51-295ef9667227 for instance with vm_state active and task_state deleting.
Feb 23 10:57:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:40.865 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:57:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:40.866 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.901 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.921 187643 DEBUG nova.network.neutron [-] [instance: 81523704-1367-466b-8876-682fba2244b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.938 187643 INFO nova.compute.manager [-] [instance: 81523704-1367-466b-8876-682fba2244b9] Took 0.80 seconds to deallocate network for instance.
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.985 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.985 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:40 compute-0 nova_compute[187639]: 2026-02-23 10:57:40.990 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:41 compute-0 nova_compute[187639]: 2026-02-23 10:57:41.020 187643 INFO nova.scheduler.client.report [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Deleted allocations for instance 81523704-1367-466b-8876-682fba2244b9
Feb 23 10:57:41 compute-0 nova_compute[187639]: 2026-02-23 10:57:41.097 187643 DEBUG oslo_concurrency.lockutils [None req-58ef8d11-c14b-4a72-aed1-15b611c99d4e 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "81523704-1367-466b-8876-682fba2244b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:41 compute-0 podman[209408]: 2026-02-23 10:57:41.856164854 +0000 UTC m=+0.055857863 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.533 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.534 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.534 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.534 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.534 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.535 187643 INFO nova.compute.manager [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Terminating instance
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.536 187643 DEBUG nova.compute.manager [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 10:57:42 compute-0 kernel: tapb6fd14a3-d4 (unregistering): left promiscuous mode
Feb 23 10:57:42 compute-0 NetworkManager[57207]: <info>  [1771844262.5622] device (tapb6fd14a3-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.565 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 ovn_controller[97601]: 2026-02-23T10:57:42Z|00052|binding|INFO|Releasing lport b6fd14a3-d411-4ffa-92c3-a38e98e7e599 from this chassis (sb_readonly=0)
Feb 23 10:57:42 compute-0 ovn_controller[97601]: 2026-02-23T10:57:42Z|00053|binding|INFO|Setting lport b6fd14a3-d411-4ffa-92c3-a38e98e7e599 down in Southbound
Feb 23 10:57:42 compute-0 ovn_controller[97601]: 2026-02-23T10:57:42Z|00054|binding|INFO|Removing iface tapb6fd14a3-d4 ovn-installed in OVS
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.567 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.573 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.574 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:12:3e 10.100.0.12'], port_security=['fa:16:3e:61:12:3e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ff2f3092-5677-4653-b466-6507edb18e01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b6fd14a3-d411-4ffa-92c3-a38e98e7e599) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.575 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b6fd14a3-d411-4ffa-92c3-a38e98e7e599 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a unbound from our chassis
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.576 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.586 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0d2adf75-e4e1-4a60-bf45-75d72530850c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.601 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f68e6d-dad2-49f6-b5db-3c59c6f2d210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.603 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[fc848a25-d89e-412d-8ec9-4e3139f5adcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.615 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d8d604-f315-4629-b924-5e3a0fa0382d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:42 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 23 10:57:42 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 16.804s CPU time.
Feb 23 10:57:42 compute-0 systemd-machined[156970]: Machine qemu-1-instance-00000002 terminated.
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.631 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef18b6d-421b-4f04-84b4-ae4fa0b52b93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa10ae0ff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:35:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327570, 'reachable_time': 27640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209443, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.639 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[365f6a56-4d96-47ca-8558-c7c05c84a186]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327577, 'tstamp': 327577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209444, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa10ae0ff-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327579, 'tstamp': 327579}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209444, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.640 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.641 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.644 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.645 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa10ae0ff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.645 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.645 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa10ae0ff-b0, col_values=(('external_ids', {'iface-id': '1f2f61eb-a013-45e9-8854-1868e5df18d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:42.645 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.751 187643 DEBUG nova.compute.manager [req-38725699-0036-4efd-a95e-c47bc31802a7 req-d64dd9cc-c6ca-453c-b0c7-e27789a16ccb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 81523704-1367-466b-8876-682fba2244b9] Received event network-vif-deleted-718140bb-9a89-42fb-8a51-295ef9667227 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.779 187643 INFO nova.virt.libvirt.driver [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Instance destroyed successfully.
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.780 187643 DEBUG nova.objects.instance [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'resources' on Instance uuid ff2f3092-5677-4653-b466-6507edb18e01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.796 187643 DEBUG nova.virt.libvirt.vif [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:55:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1848147015',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1848147015',id=2,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:55:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-gpmzgv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T10:55:29Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=ff2f3092-5677-4653-b466-6507edb18e01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.796 187643 DEBUG nova.network.os_vif_util [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "address": "fa:16:3e:61:12:3e", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6fd14a3-d4", "ovs_interfaceid": "b6fd14a3-d411-4ffa-92c3-a38e98e7e599", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.797 187643 DEBUG nova.network.os_vif_util [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.797 187643 DEBUG os_vif [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.799 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.799 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6fd14a3-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.800 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.801 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.803 187643 INFO os_vif [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:12:3e,bridge_name='br-int',has_traffic_filtering=True,id=b6fd14a3-d411-4ffa-92c3-a38e98e7e599,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6fd14a3-d4')
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.803 187643 INFO nova.virt.libvirt.driver [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Deleting instance files /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01_del
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.803 187643 INFO nova.virt.libvirt.driver [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Deletion of /var/lib/nova/instances/ff2f3092-5677-4653-b466-6507edb18e01_del complete
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.874 187643 INFO nova.compute.manager [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.874 187643 DEBUG oslo.service.loopingcall [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.875 187643 DEBUG nova.compute.manager [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 10:57:42 compute-0 nova_compute[187639]: 2026-02-23 10:57:42.875 187643 DEBUG nova.network.neutron [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 10:57:43 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:43.869 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.221 187643 DEBUG nova.network.neutron [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.247 187643 INFO nova.compute.manager [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Took 1.37 seconds to deallocate network for instance.
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.303 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.303 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.376 187643 DEBUG nova.compute.provider_tree [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.392 187643 DEBUG nova.scheduler.client.report [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.416 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.459 187643 INFO nova.scheduler.client.report [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Deleted allocations for instance ff2f3092-5677-4653-b466-6507edb18e01
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.539 187643 DEBUG oslo_concurrency.lockutils [None req-bc9a0d60-8dba-4a6c-a5fe-782625b04a42 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.839 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.854 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.854 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.854 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.855 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.855 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.856 187643 INFO nova.compute.manager [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Terminating instance
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.857 187643 DEBUG nova.compute.manager [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.859 187643 DEBUG nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received event network-vif-unplugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.859 187643 DEBUG oslo_concurrency.lockutils [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.859 187643 DEBUG oslo_concurrency.lockutils [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.859 187643 DEBUG oslo_concurrency.lockutils [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.859 187643 DEBUG nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] No waiting events found dispatching network-vif-unplugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.860 187643 WARNING nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received unexpected event network-vif-unplugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 for instance with vm_state deleted and task_state None.
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.860 187643 DEBUG nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.860 187643 DEBUG oslo_concurrency.lockutils [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ff2f3092-5677-4653-b466-6507edb18e01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.860 187643 DEBUG oslo_concurrency.lockutils [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.860 187643 DEBUG oslo_concurrency.lockutils [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ff2f3092-5677-4653-b466-6507edb18e01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.860 187643 DEBUG nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] No waiting events found dispatching network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.861 187643 WARNING nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received unexpected event network-vif-plugged-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 for instance with vm_state deleted and task_state None.
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.861 187643 DEBUG nova.compute.manager [req-4fd8135a-1162-4e36-b7d4-3a364a8e47ef req-b620ecb9-802f-4846-94aa-58455c5a223f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Received event network-vif-deleted-b6fd14a3-d411-4ffa-92c3-a38e98e7e599 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:44 compute-0 kernel: tap3335f10d-83 (unregistering): left promiscuous mode
Feb 23 10:57:44 compute-0 NetworkManager[57207]: <info>  [1771844264.8821] device (tap3335f10d-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 10:57:44 compute-0 ovn_controller[97601]: 2026-02-23T10:57:44Z|00055|binding|INFO|Releasing lport 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 from this chassis (sb_readonly=0)
Feb 23 10:57:44 compute-0 ovn_controller[97601]: 2026-02-23T10:57:44Z|00056|binding|INFO|Setting lport 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 down in Southbound
Feb 23 10:57:44 compute-0 ovn_controller[97601]: 2026-02-23T10:57:44Z|00057|binding|INFO|Removing iface tap3335f10d-83 ovn-installed in OVS
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.937 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.938 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:44 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:44.942 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:97:04 10.100.0.10'], port_security=['fa:16:3e:cf:97:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce5f9655-093d-401a-8279-2affb3f9ea4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b2fdb094fae4998b67f82aa76acda6a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '236e3228-8f89-4f1a-aff3-72992982b1b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7be34c7-3c68-4890-ae34-8249d8051594, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:57:44 compute-0 nova_compute[187639]: 2026-02-23 10:57:44.943 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:44 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:44.943 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 in datapath a10ae0ff-ba31-43e9-bf1d-9df93406b21a unbound from our chassis
Feb 23 10:57:44 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:44.945 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a10ae0ff-ba31-43e9-bf1d-9df93406b21a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:57:44 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:44.945 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[07464dc6-291a-4241-9eaa-70be1aa45f5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:44 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:44.946 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a namespace which is not needed anymore
Feb 23 10:57:44 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 23 10:57:44 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Consumed 15.393s CPU time.
Feb 23 10:57:44 compute-0 systemd-machined[156970]: Machine qemu-2-instance-00000001 terminated.
Feb 23 10:57:45 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [NOTICE]   (208585) : haproxy version is 2.8.14-c23fe91
Feb 23 10:57:45 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [NOTICE]   (208585) : path to executable is /usr/sbin/haproxy
Feb 23 10:57:45 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [WARNING]  (208585) : Exiting Master process...
Feb 23 10:57:45 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [ALERT]    (208585) : Current worker (208587) exited with code 143 (Terminated)
Feb 23 10:57:45 compute-0 neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a[208581]: [WARNING]  (208585) : All workers exited. Exiting... (0)
Feb 23 10:57:45 compute-0 systemd[1]: libpod-42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046.scope: Deactivated successfully.
Feb 23 10:57:45 compute-0 podman[209487]: 2026-02-23 10:57:45.043880459 +0000 UTC m=+0.040356377 container died 42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 10:57:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046-userdata-shm.mount: Deactivated successfully.
Feb 23 10:57:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-5cd0b9747d6c24942de8f03e484fe68b8c1d6f449d88acd963564784b5a72458-merged.mount: Deactivated successfully.
Feb 23 10:57:45 compute-0 podman[209487]: 2026-02-23 10:57:45.067321539 +0000 UTC m=+0.063797457 container cleanup 42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 10:57:45 compute-0 NetworkManager[57207]: <info>  [1771844265.0726] manager: (tap3335f10d-83): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Feb 23 10:57:45 compute-0 systemd[1]: libpod-conmon-42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046.scope: Deactivated successfully.
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.099 187643 INFO nova.virt.libvirt.driver [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Instance destroyed successfully.
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.100 187643 DEBUG nova.objects.instance [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lazy-loading 'resources' on Instance uuid ce5f9655-093d-401a-8279-2affb3f9ea4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.115 187643 DEBUG nova.virt.libvirt.vif [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:55:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1457314767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1457314767',id=1,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:55:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8b2fdb094fae4998b67f82aa76acda6a',ramdisk_id='',reservation_id='r-8g7unckd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteActionsViaActuator-1766821287',owner_user_name='tempest-TestExecuteActionsViaActuator-1766821287-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T10:56:00Z,user_data=None,user_id='38b4598a0d9649aaa7ba0cfac82e4414',uuid=ce5f9655-093d-401a-8279-2affb3f9ea4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.115 187643 DEBUG nova.network.os_vif_util [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converting VIF {"id": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "address": "fa:16:3e:cf:97:04", "network": {"id": "a10ae0ff-ba31-43e9-bf1d-9df93406b21a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-610790437-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b2fdb094fae4998b67f82aa76acda6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3335f10d-83", "ovs_interfaceid": "3335f10d-83b3-44e3-8ba7-ddb0cda1ff98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.115 187643 DEBUG nova.network.os_vif_util [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.116 187643 DEBUG os_vif [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.117 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.117 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3335f10d-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.118 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.119 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:45 compute-0 podman[209523]: 2026-02-23 10:57:45.120786616 +0000 UTC m=+0.037505899 container remove 42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.121 187643 INFO os_vif [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:97:04,bridge_name='br-int',has_traffic_filtering=True,id=3335f10d-83b3-44e3-8ba7-ddb0cda1ff98,network=Network(a10ae0ff-ba31-43e9-bf1d-9df93406b21a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3335f10d-83')
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.122 187643 INFO nova.virt.libvirt.driver [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Deleting instance files /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c_del
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.125 187643 INFO nova.virt.libvirt.driver [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Deletion of /var/lib/nova/instances/ce5f9655-093d-401a-8279-2affb3f9ea4c_del complete
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.125 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b7177988-3728-4b32-92b2-df82000680a7]: (4, ('Mon Feb 23 10:57:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a (42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046)\n42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046\nMon Feb 23 10:57:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a (42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046)\n42f387ccf8ae7b03e03ac4f5d5aeb1af7c27aa78b17d37dcd46b842d16d04046\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.126 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1658b00a-5c52-46e9-b145-c40694120ca6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.127 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa10ae0ff-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.128 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:45 compute-0 kernel: tapa10ae0ff-b0: left promiscuous mode
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.131 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.135 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.135 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb479cc-23bc-46c9-b2df-cf97364d1f1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.158 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[29353081-f755-41f2-b48f-b7303cd5d52d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.159 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d92264-90f7-4a9d-bb27-ec9613881e5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.170 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d76529-20e0-4ccf-9bee-b74ca22587da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327562, 'reachable_time': 19507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209552, 'error': None, 'target': 'ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.176 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a10ae0ff-ba31-43e9-bf1d-9df93406b21a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 10:57:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:57:45.177 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[b07b08c4-5721-4f5c-8d0a-0005d2d22f02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:57:45 compute-0 systemd[1]: run-netns-ovnmeta\x2da10ae0ff\x2dba31\x2d43e9\x2dbf1d\x2d9df93406b21a.mount: Deactivated successfully.
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.211 187643 INFO nova.compute.manager [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.213 187643 DEBUG oslo.service.loopingcall [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.213 187643 DEBUG nova.compute.manager [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.213 187643 DEBUG nova.network.neutron [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.761 187643 DEBUG nova.network.neutron [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.794 187643 INFO nova.compute.manager [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Took 0.58 seconds to deallocate network for instance.
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.856 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.857 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.894 187643 DEBUG nova.compute.provider_tree [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.912 187643 DEBUG nova.scheduler.client.report [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.933 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:45 compute-0 nova_compute[187639]: 2026-02-23 10:57:45.957 187643 INFO nova.scheduler.client.report [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Deleted allocations for instance ce5f9655-093d-401a-8279-2affb3f9ea4c
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.025 187643 DEBUG oslo_concurrency.lockutils [None req-3536d0de-7624-46f2-817a-badaff9894e3 38b4598a0d9649aaa7ba0cfac82e4414 8b2fdb094fae4998b67f82aa76acda6a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.966 187643 DEBUG nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-unplugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.966 187643 DEBUG oslo_concurrency.lockutils [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.967 187643 DEBUG oslo_concurrency.lockutils [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.967 187643 DEBUG oslo_concurrency.lockutils [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.968 187643 DEBUG nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] No waiting events found dispatching network-vif-unplugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.968 187643 WARNING nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received unexpected event network-vif-unplugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for instance with vm_state deleted and task_state None.
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.969 187643 DEBUG nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.969 187643 DEBUG oslo_concurrency.lockutils [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.970 187643 DEBUG oslo_concurrency.lockutils [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.970 187643 DEBUG oslo_concurrency.lockutils [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "ce5f9655-093d-401a-8279-2affb3f9ea4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.971 187643 DEBUG nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] No waiting events found dispatching network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.971 187643 WARNING nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received unexpected event network-vif-plugged-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 for instance with vm_state deleted and task_state None.
Feb 23 10:57:46 compute-0 nova_compute[187639]: 2026-02-23 10:57:46.972 187643 DEBUG nova.compute.manager [req-3807bd9f-5c4c-4d69-bc6d-9669b9adf135 req-b0ad089c-b778-4803-b33c-72fd345e4917 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Received event network-vif-deleted-3335f10d-83b3-44e3-8ba7-ddb0cda1ff98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:57:47 compute-0 nova_compute[187639]: 2026-02-23 10:57:47.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:47 compute-0 nova_compute[187639]: 2026-02-23 10:57:47.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 10:57:47 compute-0 nova_compute[187639]: 2026-02-23 10:57:47.712 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 10:57:48 compute-0 nova_compute[187639]: 2026-02-23 10:57:48.694 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:48 compute-0 nova_compute[187639]: 2026-02-23 10:57:48.694 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:49 compute-0 nova_compute[187639]: 2026-02-23 10:57:49.713 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:49 compute-0 nova_compute[187639]: 2026-02-23 10:57:49.839 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:50 compute-0 nova_compute[187639]: 2026-02-23 10:57:50.119 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:50 compute-0 nova_compute[187639]: 2026-02-23 10:57:50.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:50 compute-0 nova_compute[187639]: 2026-02-23 10:57:50.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:57:50 compute-0 nova_compute[187639]: 2026-02-23 10:57:50.731 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:51 compute-0 nova_compute[187639]: 2026-02-23 10:57:51.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:51 compute-0 nova_compute[187639]: 2026-02-23 10:57:51.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:51 compute-0 nova_compute[187639]: 2026-02-23 10:57:51.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:51 compute-0 nova_compute[187639]: 2026-02-23 10:57:51.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:51 compute-0 nova_compute[187639]: 2026-02-23 10:57:51.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.176 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844258.1754885, a9b337ca-0b37-41f6-bc6c-df32d188e518 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.177 187643 INFO nova.compute.manager [-] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] VM Stopped (Lifecycle Event)
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.199 187643 DEBUG nova.compute.manager [None req-a5c53dc0-cad3-41d7-9b99-11093ec75d5f - - - - - -] [instance: a9b337ca-0b37-41f6-bc6c-df32d188e518] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.709 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.731 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.732 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.762 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.762 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.763 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.763 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.886 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.888 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.20632553100586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.888 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:57:53 compute-0 nova_compute[187639]: 2026-02-23 10:57:53.888 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.079 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.080 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.182 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.202 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.237 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.237 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.682 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:54 compute-0 podman[209555]: 2026-02-23 10:57:54.840040082 +0000 UTC m=+0.047411416 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:57:54 compute-0 nova_compute[187639]: 2026-02-23 10:57:54.840 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:55 compute-0 nova_compute[187639]: 2026-02-23 10:57:55.062 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844260.061253, 81523704-1367-466b-8876-682fba2244b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:57:55 compute-0 nova_compute[187639]: 2026-02-23 10:57:55.063 187643 INFO nova.compute.manager [-] [instance: 81523704-1367-466b-8876-682fba2244b9] VM Stopped (Lifecycle Event)
Feb 23 10:57:55 compute-0 nova_compute[187639]: 2026-02-23 10:57:55.085 187643 DEBUG nova.compute.manager [None req-53730267-e149-4740-a9f1-db071ff01831 - - - - - -] [instance: 81523704-1367-466b-8876-682fba2244b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:57:55 compute-0 nova_compute[187639]: 2026-02-23 10:57:55.120 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:57:55 compute-0 nova_compute[187639]: 2026-02-23 10:57:55.196 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:57 compute-0 nova_compute[187639]: 2026-02-23 10:57:57.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:57:57 compute-0 nova_compute[187639]: 2026-02-23 10:57:57.778 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844262.777984, ff2f3092-5677-4653-b466-6507edb18e01 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:57:57 compute-0 nova_compute[187639]: 2026-02-23 10:57:57.778 187643 INFO nova.compute.manager [-] [instance: ff2f3092-5677-4653-b466-6507edb18e01] VM Stopped (Lifecycle Event)
Feb 23 10:57:57 compute-0 nova_compute[187639]: 2026-02-23 10:57:57.803 187643 DEBUG nova.compute.manager [None req-dc106f7d-276f-4abc-8fb0-32b67627c64c - - - - - -] [instance: ff2f3092-5677-4653-b466-6507edb18e01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:57:59 compute-0 podman[197002]: time="2026-02-23T10:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:57:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:57:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Feb 23 10:57:59 compute-0 nova_compute[187639]: 2026-02-23 10:57:59.843 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:00 compute-0 nova_compute[187639]: 2026-02-23 10:58:00.098 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844265.0972598, ce5f9655-093d-401a-8279-2affb3f9ea4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:58:00 compute-0 nova_compute[187639]: 2026-02-23 10:58:00.098 187643 INFO nova.compute.manager [-] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] VM Stopped (Lifecycle Event)
Feb 23 10:58:00 compute-0 nova_compute[187639]: 2026-02-23 10:58:00.116 187643 DEBUG nova.compute.manager [None req-b20e7d4a-d93c-4857-8c5a-f8546e4fba22 - - - - - -] [instance: ce5f9655-093d-401a-8279-2affb3f9ea4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:58:00 compute-0 nova_compute[187639]: 2026-02-23 10:58:00.138 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:01 compute-0 openstack_network_exporter[199919]: ERROR   10:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:58:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:58:01 compute-0 openstack_network_exporter[199919]: ERROR   10:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:58:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:58:02 compute-0 podman[209580]: 2026-02-23 10:58:02.90427557 +0000 UTC m=+0.101989384 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:58:04 compute-0 nova_compute[187639]: 2026-02-23 10:58:04.845 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:05 compute-0 nova_compute[187639]: 2026-02-23 10:58:05.164 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:05 compute-0 sshd-session[209599]: Invalid user admin from 143.198.30.3 port 35512
Feb 23 10:58:05 compute-0 sshd-session[209599]: Connection closed by invalid user admin 143.198.30.3 port 35512 [preauth]
Feb 23 10:58:08 compute-0 podman[209601]: 2026-02-23 10:58:08.927559053 +0000 UTC m=+0.132429182 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:58:09 compute-0 nova_compute[187639]: 2026-02-23 10:58:09.847 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:10 compute-0 nova_compute[187639]: 2026-02-23 10:58:10.207 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:10 compute-0 sshd-session[209630]: Connection closed by authenticating user root 165.227.79.48 port 59354 [preauth]
Feb 23 10:58:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:12.639 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:12.640 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:12.640 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:12 compute-0 podman[209632]: 2026-02-23 10:58:12.897310095 +0000 UTC m=+0.080582377 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, release=1770267347, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Feb 23 10:58:14 compute-0 nova_compute[187639]: 2026-02-23 10:58:14.849 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:15 compute-0 nova_compute[187639]: 2026-02-23 10:58:15.209 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:19 compute-0 nova_compute[187639]: 2026-02-23 10:58:19.851 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:20 compute-0 nova_compute[187639]: 2026-02-23 10:58:20.259 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:24 compute-0 nova_compute[187639]: 2026-02-23 10:58:24.879 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:25 compute-0 nova_compute[187639]: 2026-02-23 10:58:25.261 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:25 compute-0 podman[209654]: 2026-02-23 10:58:25.854017097 +0000 UTC m=+0.055591440 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:58:29 compute-0 podman[197002]: time="2026-02-23T10:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:58:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:58:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Feb 23 10:58:29 compute-0 nova_compute[187639]: 2026-02-23 10:58:29.926 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:30 compute-0 nova_compute[187639]: 2026-02-23 10:58:30.262 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:30 compute-0 ovn_controller[97601]: 2026-02-23T10:58:30Z|00058|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Feb 23 10:58:31 compute-0 openstack_network_exporter[199919]: ERROR   10:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:58:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:58:31 compute-0 openstack_network_exporter[199919]: ERROR   10:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:58:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:58:33 compute-0 podman[209679]: 2026-02-23 10:58:33.830317368 +0000 UTC m=+0.038370495 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 10:58:34 compute-0 nova_compute[187639]: 2026-02-23 10:58:34.972 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:35 compute-0 nova_compute[187639]: 2026-02-23 10:58:35.263 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:39 compute-0 sshd-session[209699]: Invalid user admin from 143.198.30.3 port 42360
Feb 23 10:58:39 compute-0 sshd-session[209699]: Connection closed by invalid user admin 143.198.30.3 port 42360 [preauth]
Feb 23 10:58:39 compute-0 podman[209701]: 2026-02-23 10:58:39.52303236 +0000 UTC m=+0.113195752 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 10:58:39 compute-0 nova_compute[187639]: 2026-02-23 10:58:39.975 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:40 compute-0 nova_compute[187639]: 2026-02-23 10:58:40.265 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.504 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.505 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.525 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.694 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.694 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.699 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.699 187643 INFO nova.compute.claims [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Claim successful on node compute-0.ctlplane.example.com
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.843 187643 DEBUG nova.compute.provider_tree [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.860 187643 DEBUG nova.scheduler.client.report [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.884 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.885 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.928 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.928 187643 DEBUG nova.network.neutron [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.945 187643 INFO nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 10:58:41 compute-0 nova_compute[187639]: 2026-02-23 10:58:41.965 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.040 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.041 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.042 187643 INFO nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Creating image(s)
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.042 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "/var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.043 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "/var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.043 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "/var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.058 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.134 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.136 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.137 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.161 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.220 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.222 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.243 187643 DEBUG nova.policy [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '845a17801d704603adef909e3f49b086', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '737c024c3cf24bc7b040b295c4f6eae9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.259 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.260 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.261 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.338 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.339 187643 DEBUG nova.virt.disk.api [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Checking if we can resize image /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.339 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.401 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.402 187643 DEBUG nova.virt.disk.api [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Cannot resize image /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.403 187643 DEBUG nova.objects.instance [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lazy-loading 'migration_context' on Instance uuid c11301ca-f35c-44f3-a976-afb41a5e66c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.421 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.422 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Ensure instance console log exists: /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.422 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.422 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.423 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.717 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:42.719 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:58:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:42.720 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:58:42 compute-0 nova_compute[187639]: 2026-02-23 10:58:42.844 187643 DEBUG nova.network.neutron [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Successfully created port: ad14dae8-a715-40c6-bcdd-1f3de3f228fd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 10:58:43 compute-0 podman[209742]: 2026-02-23 10:58:43.884566322 +0000 UTC m=+0.082509431 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, config_id=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Feb 23 10:58:45 compute-0 nova_compute[187639]: 2026-02-23 10:58:45.020 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:45 compute-0 nova_compute[187639]: 2026-02-23 10:58:45.266 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.062 187643 DEBUG nova.network.neutron [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Successfully updated port: ad14dae8-a715-40c6-bcdd-1f3de3f228fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.084 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.084 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquired lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.084 187643 DEBUG nova.network.neutron [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.161 187643 DEBUG nova.compute.manager [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-changed-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.161 187643 DEBUG nova.compute.manager [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Refreshing instance network info cache due to event network-changed-ad14dae8-a715-40c6-bcdd-1f3de3f228fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.162 187643 DEBUG oslo_concurrency.lockutils [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:58:46 compute-0 nova_compute[187639]: 2026-02-23 10:58:46.194 187643 DEBUG nova.network.neutron [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 10:58:49 compute-0 nova_compute[187639]: 2026-02-23 10:58:49.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:50 compute-0 nova_compute[187639]: 2026-02-23 10:58:50.081 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:50 compute-0 nova_compute[187639]: 2026-02-23 10:58:50.267 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.166 187643 DEBUG nova.network.neutron [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updating instance_info_cache with network_info: [{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.187 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Releasing lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.188 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Instance network_info: |[{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.188 187643 DEBUG oslo_concurrency.lockutils [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.189 187643 DEBUG nova.network.neutron [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Refreshing network info cache for port ad14dae8-a715-40c6-bcdd-1f3de3f228fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.192 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Start _get_guest_xml network_info=[{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.197 187643 WARNING nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.202 187643 DEBUG nova.virt.libvirt.host [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.203 187643 DEBUG nova.virt.libvirt.host [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.210 187643 DEBUG nova.virt.libvirt.host [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.211 187643 DEBUG nova.virt.libvirt.host [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.212 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.213 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.213 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.214 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.214 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.214 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.215 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.215 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.215 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.215 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.216 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.216 187643 DEBUG nova.virt.hardware [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.220 187643 DEBUG nova.virt.libvirt.vif [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T10:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1348569289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1348569289',id=8,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='737c024c3cf24bc7b040b295c4f6eae9',ramdisk_id='',reservation_id='r-sndy05jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1128369139',owner_user_name='tempest-TestExecuteBasicStrategy-1128369139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:58:41Z,user_data=None,user_id='845a17801d704603adef909e3f49b086',uuid=c11301ca-f35c-44f3-a976-afb41a5e66c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.220 187643 DEBUG nova.network.os_vif_util [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Converting VIF {"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.221 187643 DEBUG nova.network.os_vif_util [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.222 187643 DEBUG nova.objects.instance [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lazy-loading 'pci_devices' on Instance uuid c11301ca-f35c-44f3-a976-afb41a5e66c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.238 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] End _get_guest_xml xml=<domain type="kvm">
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <uuid>c11301ca-f35c-44f3-a976-afb41a5e66c9</uuid>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <name>instance-00000008</name>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <metadata>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1348569289</nova:name>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 10:58:51</nova:creationTime>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:user uuid="845a17801d704603adef909e3f49b086">tempest-TestExecuteBasicStrategy-1128369139-project-member</nova:user>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:project uuid="737c024c3cf24bc7b040b295c4f6eae9">tempest-TestExecuteBasicStrategy-1128369139</nova:project>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         <nova:port uuid="ad14dae8-a715-40c6-bcdd-1f3de3f228fd">
Feb 23 10:58:51 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </metadata>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <system>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <entry name="serial">c11301ca-f35c-44f3-a976-afb41a5e66c9</entry>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <entry name="uuid">c11301ca-f35c-44f3-a976-afb41a5e66c9</entry>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </system>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <os>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </os>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <features>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <apic/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </features>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </clock>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </cpu>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   <devices>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.config"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </disk>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:be:87:df"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <target dev="tapad14dae8-a7"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </interface>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/console.log" append="off"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </serial>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <video>
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </video>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </rng>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 10:58:51 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 10:58:51 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 10:58:51 compute-0 nova_compute[187639]:   </devices>
Feb 23 10:58:51 compute-0 nova_compute[187639]: </domain>
Feb 23 10:58:51 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.239 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Preparing to wait for external event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.239 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.239 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.240 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.240 187643 DEBUG nova.virt.libvirt.vif [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T10:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1348569289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1348569289',id=8,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='737c024c3cf24bc7b040b295c4f6eae9',ramdisk_id='',reservation_id='r-sndy05jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1128369139',owner_user_name='tempest-TestExecuteBasicStrategy-1128369139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T10:58:41Z,user_data=None,user_id='845a17801d704603adef909e3f49b086',uuid=c11301ca-f35c-44f3-a976-afb41a5e66c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.241 187643 DEBUG nova.network.os_vif_util [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Converting VIF {"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.242 187643 DEBUG nova.network.os_vif_util [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.242 187643 DEBUG os_vif [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.243 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.243 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.243 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.246 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.246 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad14dae8-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.247 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad14dae8-a7, col_values=(('external_ids', {'iface-id': 'ad14dae8-a715-40c6-bcdd-1f3de3f228fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:87:df', 'vm-uuid': 'c11301ca-f35c-44f3-a976-afb41a5e66c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.248 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:51 compute-0 NetworkManager[57207]: <info>  [1771844331.2495] manager: (tapad14dae8-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.251 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.257 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.258 187643 INFO os_vif [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7')
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.301 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.301 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.301 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] No VIF found with MAC fa:16:3e:be:87:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.302 187643 INFO nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Using config drive
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.861 187643 INFO nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Creating config drive at /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.config
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.866 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpjgbrqjdd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:51 compute-0 nova_compute[187639]: 2026-02-23 10:58:51.983 187643 DEBUG oslo_concurrency.processutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpjgbrqjdd" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:52 compute-0 kernel: tapad14dae8-a7: entered promiscuous mode
Feb 23 10:58:52 compute-0 NetworkManager[57207]: <info>  [1771844332.0460] manager: (tapad14dae8-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 23 10:58:52 compute-0 ovn_controller[97601]: 2026-02-23T10:58:52Z|00059|binding|INFO|Claiming lport ad14dae8-a715-40c6-bcdd-1f3de3f228fd for this chassis.
Feb 23 10:58:52 compute-0 ovn_controller[97601]: 2026-02-23T10:58:52Z|00060|binding|INFO|ad14dae8-a715-40c6-bcdd-1f3de3f228fd: Claiming fa:16:3e:be:87:df 10.100.0.5
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.046 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.050 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.053 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.068 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:87:df 10.100.0.5'], port_security=['fa:16:3e:be:87:df 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c11301ca-f35c-44f3-a976-afb41a5e66c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '737c024c3cf24bc7b040b295c4f6eae9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7200f997-db9c-488a-8ca5-9f959124bf4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec51f24c-5ac4-4e70-8d9a-50764c2b9502, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=ad14dae8-a715-40c6-bcdd-1f3de3f228fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.069 106968 INFO neutron.agent.ovn.metadata.agent [-] Port ad14dae8-a715-40c6-bcdd-1f3de3f228fd in datapath b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf bound to our chassis
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.071 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf
Feb 23 10:58:52 compute-0 systemd-machined[156970]: New machine qemu-5-instance-00000008.
Feb 23 10:58:52 compute-0 ovn_controller[97601]: 2026-02-23T10:58:52Z|00061|binding|INFO|Setting lport ad14dae8-a715-40c6-bcdd-1f3de3f228fd ovn-installed in OVS
Feb 23 10:58:52 compute-0 ovn_controller[97601]: 2026-02-23T10:58:52Z|00062|binding|INFO|Setting lport ad14dae8-a715-40c6-bcdd-1f3de3f228fd up in Southbound
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.079 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[31e67688-918c-4188-ac58-9ccdfec07140]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.080 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb32ffd3e-b1 in ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.080 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.081 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb32ffd3e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.082 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[35b13480-b8af-4139-a315-b9f7fc429666]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.082 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6126d81f-ba70-4f1e-94ad-e6f2419904d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.090 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[779a306f-7af1-498f-b0cb-cd21d2abfe84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 systemd-udevd[209785]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.099 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[806df5db-039a-43b3-b93c-980491465931]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 NetworkManager[57207]: <info>  [1771844332.1079] device (tapad14dae8-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 10:58:52 compute-0 NetworkManager[57207]: <info>  [1771844332.1084] device (tapad14dae8-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.118 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[cc97cba0-4aa0-4ef5-8206-f67995974b52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 systemd-udevd[209788]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:58:52 compute-0 NetworkManager[57207]: <info>  [1771844332.1221] manager: (tapb32ffd3e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.121 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb949f8-95d0-4e27-b334-101e917e8fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.145 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac40d7d-76ee-4b55-ba01-cde2ab1bd8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.148 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[dad75413-934e-42f7-9e28-c4915cc5ed44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 NetworkManager[57207]: <info>  [1771844332.1639] device (tapb32ffd3e-b0): carrier: link connected
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.167 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[a78870d4-d2f9-4505-b7d9-e58321e0235e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.180 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[96ff404a-fbd8-464a-9331-568843c32e85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb32ffd3e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:08:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347633, 'reachable_time': 21628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209815, 'error': None, 'target': 'ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.193 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[168e5b87-3f58-4dff-9503-a013e88795a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:874'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347633, 'tstamp': 347633}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209816, 'error': None, 'target': 'ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.206 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e96c4e99-e26c-422b-818a-73654cdc1f39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb32ffd3e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:08:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347633, 'reachable_time': 21628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209817, 'error': None, 'target': 'ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.227 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[20043df4-75af-4aa6-989f-107caeb48f59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.268 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6097d40d-d2e0-4bc1-8935-2eceed6acf69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.270 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb32ffd3e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.270 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.270 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb32ffd3e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.318 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 kernel: tapb32ffd3e-b0: entered promiscuous mode
Feb 23 10:58:52 compute-0 NetworkManager[57207]: <info>  [1771844332.3192] manager: (tapb32ffd3e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.325 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.326 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb32ffd3e-b0, col_values=(('external_ids', {'iface-id': 'e07d7547-aac4-48f7-8453-cf7d0a5b6f12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.327 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 ovn_controller[97601]: 2026-02-23T10:58:52Z|00063|binding|INFO|Releasing lport e07d7547-aac4-48f7-8453-cf7d0a5b6f12 from this chassis (sb_readonly=0)
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.334 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.335 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.335 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.336 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[21736a21-844f-4f8b-899c-d9dbe3210afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.337 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: global
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf.pid.haproxy
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.338 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'env', 'PROCESS_TAG=haproxy-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.349 187643 DEBUG nova.compute.manager [req-028c5393-5d59-4f29-96e7-9605c10e482d req-144fe3ac-b4af-488e-a248-2ba45b3b1b8e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.349 187643 DEBUG oslo_concurrency.lockutils [req-028c5393-5d59-4f29-96e7-9605c10e482d req-144fe3ac-b4af-488e-a248-2ba45b3b1b8e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.349 187643 DEBUG oslo_concurrency.lockutils [req-028c5393-5d59-4f29-96e7-9605c10e482d req-144fe3ac-b4af-488e-a248-2ba45b3b1b8e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.350 187643 DEBUG oslo_concurrency.lockutils [req-028c5393-5d59-4f29-96e7-9605c10e482d req-144fe3ac-b4af-488e-a248-2ba45b3b1b8e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.350 187643 DEBUG nova.compute.manager [req-028c5393-5d59-4f29-96e7-9605c10e482d req-144fe3ac-b4af-488e-a248-2ba45b3b1b8e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Processing event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.563 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.565 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844332.5622559, c11301ca-f35c-44f3-a976-afb41a5e66c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.566 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] VM Started (Lifecycle Event)
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.568 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.573 187643 INFO nova.virt.libvirt.driver [-] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Instance spawned successfully.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.573 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.595 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.600 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.602 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.603 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.603 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.603 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.604 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.604 187643 DEBUG nova.virt.libvirt.driver [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 10:58:52 compute-0 podman[209855]: 2026-02-23 10:58:52.64027086 +0000 UTC m=+0.046639233 container create 8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.668 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.669 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844332.562559, c11301ca-f35c-44f3-a976-afb41a5e66c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.669 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] VM Paused (Lifecycle Event)
Feb 23 10:58:52 compute-0 systemd[1]: Started libpod-conmon-8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0.scope.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:52 compute-0 systemd[1]: Started libcrun container.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:58:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45cb11ad64d21ece4c4f17e6ba9962a0b06db81233b29ecb8a63251c3d3b4a5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:58:52 compute-0 podman[209855]: 2026-02-23 10:58:52.701428057 +0000 UTC m=+0.107796430 container init 8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:58:52 compute-0 podman[209855]: 2026-02-23 10:58:52.706735567 +0000 UTC m=+0.113103940 container start 8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 10:58:52 compute-0 podman[209855]: 2026-02-23 10:58:52.617633942 +0000 UTC m=+0.024002315 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 10:58:52 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:58:52.721 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:58:52 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [NOTICE]   (209875) : New worker (209877) forked
Feb 23 10:58:52 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [NOTICE]   (209875) : Loading success.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.764 187643 INFO nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Took 10.72 seconds to spawn the instance on the hypervisor.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.765 187643 DEBUG nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.809 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.811 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844332.5667517, c11301ca-f35c-44f3-a976-afb41a5e66c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.811 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] VM Resumed (Lifecycle Event)
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.877 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.882 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.895 187643 INFO nova.compute.manager [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Took 11.31 seconds to build instance.
Feb 23 10:58:52 compute-0 nova_compute[187639]: 2026-02-23 10:58:52.918 187643 DEBUG oslo_concurrency.lockutils [None req-0f4a4d6a-d44a-4967-80b3-f840f7c48439 845a17801d704603adef909e3f49b086 737c024c3cf24bc7b040b295c4f6eae9 - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:53 compute-0 nova_compute[187639]: 2026-02-23 10:58:53.211 187643 DEBUG nova.network.neutron [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updated VIF entry in instance network info cache for port ad14dae8-a715-40c6-bcdd-1f3de3f228fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 10:58:53 compute-0 nova_compute[187639]: 2026-02-23 10:58:53.212 187643 DEBUG nova.network.neutron [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updating instance_info_cache with network_info: [{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:58:53 compute-0 nova_compute[187639]: 2026-02-23 10:58:53.229 187643 DEBUG oslo_concurrency.lockutils [req-5ede1c85-a613-4d45-86bc-f1ba828c4847 req-c714908c-6063-4de0-8a45-5df3d46ad060 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:58:53 compute-0 nova_compute[187639]: 2026-02-23 10:58:53.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:53 compute-0 nova_compute[187639]: 2026-02-23 10:58:53.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:58:53 compute-0 nova_compute[187639]: 2026-02-23 10:58:53.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.200 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.201 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.201 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.202 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid c11301ca-f35c-44f3-a976-afb41a5e66c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.489 187643 DEBUG nova.compute.manager [req-61070151-866f-4a7d-9cc5-45ae59a2cd83 req-f6ac87b9-cfca-424c-ac10-6a06b61f0bbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.490 187643 DEBUG oslo_concurrency.lockutils [req-61070151-866f-4a7d-9cc5-45ae59a2cd83 req-f6ac87b9-cfca-424c-ac10-6a06b61f0bbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.490 187643 DEBUG oslo_concurrency.lockutils [req-61070151-866f-4a7d-9cc5-45ae59a2cd83 req-f6ac87b9-cfca-424c-ac10-6a06b61f0bbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.491 187643 DEBUG oslo_concurrency.lockutils [req-61070151-866f-4a7d-9cc5-45ae59a2cd83 req-f6ac87b9-cfca-424c-ac10-6a06b61f0bbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.491 187643 DEBUG nova.compute.manager [req-61070151-866f-4a7d-9cc5-45ae59a2cd83 req-f6ac87b9-cfca-424c-ac10-6a06b61f0bbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:58:54 compute-0 nova_compute[187639]: 2026-02-23 10:58:54.492 187643 WARNING nova.compute.manager [req-61070151-866f-4a7d-9cc5-45ae59a2cd83 req-f6ac87b9-cfca-424c-ac10-6a06b61f0bbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received unexpected event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with vm_state active and task_state None.
Feb 23 10:58:55 compute-0 nova_compute[187639]: 2026-02-23 10:58:55.085 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.231 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updating instance_info_cache with network_info: [{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.249 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.262 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.263 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.263 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.263 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.294 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.294 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.295 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.295 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.416 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:56 compute-0 podman[209886]: 2026-02-23 10:58:56.441734529 +0000 UTC m=+0.097330992 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:58:56 compute-0 sshd-session[209888]: Connection closed by authenticating user root 165.227.79.48 port 56076 [preauth]
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.494 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.495 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.538 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.688 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.689 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5728MB free_disk=73.20531845092773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.690 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.690 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.907 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance c11301ca-f35c-44f3-a976-afb41a5e66c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.908 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.908 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.958 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:58:56 compute-0 nova_compute[187639]: 2026-02-23 10:58:56.977 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:58:57 compute-0 nova_compute[187639]: 2026-02-23 10:58:57.013 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:58:57 compute-0 nova_compute[187639]: 2026-02-23 10:58:57.013 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:58:57 compute-0 nova_compute[187639]: 2026-02-23 10:58:57.441 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:58:59 compute-0 podman[197002]: time="2026-02-23T10:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:58:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 10:58:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2629 "" "Go-http-client/1.1"
Feb 23 10:59:00 compute-0 nova_compute[187639]: 2026-02-23 10:59:00.112 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:01 compute-0 nova_compute[187639]: 2026-02-23 10:59:01.293 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:01 compute-0 openstack_network_exporter[199919]: ERROR   10:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:59:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:59:01 compute-0 openstack_network_exporter[199919]: ERROR   10:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:59:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:59:04 compute-0 ovn_controller[97601]: 2026-02-23T10:59:04Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:87:df 10.100.0.5
Feb 23 10:59:04 compute-0 ovn_controller[97601]: 2026-02-23T10:59:04Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:87:df 10.100.0.5
Feb 23 10:59:04 compute-0 podman[209941]: 2026-02-23 10:59:04.871440252 +0000 UTC m=+0.072495257 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 10:59:05 compute-0 nova_compute[187639]: 2026-02-23 10:59:05.169 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:06 compute-0 nova_compute[187639]: 2026-02-23 10:59:06.295 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:09 compute-0 podman[209960]: 2026-02-23 10:59:09.906464114 +0000 UTC m=+0.099496990 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 10:59:10 compute-0 nova_compute[187639]: 2026-02-23 10:59:10.211 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:11 compute-0 nova_compute[187639]: 2026-02-23 10:59:11.325 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:12.640 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:12.642 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:12.643 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:13 compute-0 sshd-session[209987]: Invalid user admin from 143.198.30.3 port 45054
Feb 23 10:59:13 compute-0 sshd-session[209987]: Connection closed by invalid user admin 143.198.30.3 port 45054 [preauth]
Feb 23 10:59:14 compute-0 podman[209989]: 2026-02-23 10:59:14.874713362 +0000 UTC m=+0.070090924 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1770267347, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Feb 23 10:59:15 compute-0 nova_compute[187639]: 2026-02-23 10:59:15.260 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:16 compute-0 nova_compute[187639]: 2026-02-23 10:59:16.375 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:20 compute-0 nova_compute[187639]: 2026-02-23 10:59:20.135 187643 DEBUG nova.compute.manager [None req-66e62987-b3c6-4e8b-9731-4384b38c47f2 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 23 10:59:20 compute-0 nova_compute[187639]: 2026-02-23 10:59:20.204 187643 DEBUG nova.compute.provider_tree [None req-66e62987-b3c6-4e8b-9731-4384b38c47f2 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 5 to 9 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 10:59:20 compute-0 nova_compute[187639]: 2026-02-23 10:59:20.306 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:21 compute-0 nova_compute[187639]: 2026-02-23 10:59:21.377 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:22 compute-0 ovn_controller[97601]: 2026-02-23T10:59:22Z|00064|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 10:59:24 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 23 10:59:24 compute-0 nova_compute[187639]: 2026-02-23 10:59:24.662 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Check if temp file /var/lib/nova/instances/tmpg9jxhx1q exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 10:59:24 compute-0 nova_compute[187639]: 2026-02-23 10:59:24.662 187643 DEBUG nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg9jxhx1q',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c11301ca-f35c-44f3-a976-afb41a5e66c9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 10:59:25 compute-0 nova_compute[187639]: 2026-02-23 10:59:25.308 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:26 compute-0 nova_compute[187639]: 2026-02-23 10:59:26.378 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:26 compute-0 podman[210011]: 2026-02-23 10:59:26.839399333 +0000 UTC m=+0.040992054 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:59:27 compute-0 nova_compute[187639]: 2026-02-23 10:59:27.901 187643 DEBUG oslo_concurrency.processutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:59:27 compute-0 nova_compute[187639]: 2026-02-23 10:59:27.958 187643 DEBUG oslo_concurrency.processutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:59:27 compute-0 nova_compute[187639]: 2026-02-23 10:59:27.959 187643 DEBUG oslo_concurrency.processutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:59:28 compute-0 nova_compute[187639]: 2026-02-23 10:59:28.035 187643 DEBUG oslo_concurrency.processutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:59:29 compute-0 podman[197002]: time="2026-02-23T10:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:59:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 10:59:29 compute-0 podman[197002]: @ - - [23/Feb/2026:10:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2624 "" "Go-http-client/1.1"
Feb 23 10:59:30 compute-0 nova_compute[187639]: 2026-02-23 10:59:30.333 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:31 compute-0 nova_compute[187639]: 2026-02-23 10:59:31.411 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:31 compute-0 openstack_network_exporter[199919]: ERROR   10:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:59:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:59:31 compute-0 openstack_network_exporter[199919]: ERROR   10:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:59:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 10:59:31 compute-0 sshd-session[210043]: Accepted publickey for nova from 192.168.122.101 port 59676 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 10:59:31 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 10:59:31 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 10:59:31 compute-0 systemd-logind[808]: New session 33 of user nova.
Feb 23 10:59:31 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 10:59:31 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 10:59:31 compute-0 systemd[210047]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:59:31 compute-0 systemd[210047]: Queued start job for default target Main User Target.
Feb 23 10:59:31 compute-0 systemd[210047]: Created slice User Application Slice.
Feb 23 10:59:31 compute-0 systemd[210047]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 10:59:31 compute-0 systemd[210047]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 10:59:31 compute-0 systemd[210047]: Reached target Paths.
Feb 23 10:59:31 compute-0 systemd[210047]: Reached target Timers.
Feb 23 10:59:31 compute-0 systemd[210047]: Starting D-Bus User Message Bus Socket...
Feb 23 10:59:31 compute-0 systemd[210047]: Starting Create User's Volatile Files and Directories...
Feb 23 10:59:31 compute-0 systemd[210047]: Finished Create User's Volatile Files and Directories.
Feb 23 10:59:31 compute-0 systemd[210047]: Listening on D-Bus User Message Bus Socket.
Feb 23 10:59:31 compute-0 systemd[210047]: Reached target Sockets.
Feb 23 10:59:31 compute-0 systemd[210047]: Reached target Basic System.
Feb 23 10:59:31 compute-0 systemd[210047]: Reached target Main User Target.
Feb 23 10:59:31 compute-0 systemd[210047]: Startup finished in 109ms.
Feb 23 10:59:31 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 10:59:31 compute-0 systemd[1]: Started Session 33 of User nova.
Feb 23 10:59:31 compute-0 sshd-session[210043]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 10:59:31 compute-0 sshd-session[210062]: Received disconnect from 192.168.122.101 port 59676:11: disconnected by user
Feb 23 10:59:31 compute-0 sshd-session[210062]: Disconnected from user nova 192.168.122.101 port 59676
Feb 23 10:59:31 compute-0 sshd-session[210043]: pam_unix(sshd:session): session closed for user nova
Feb 23 10:59:31 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Feb 23 10:59:31 compute-0 systemd-logind[808]: Session 33 logged out. Waiting for processes to exit.
Feb 23 10:59:31 compute-0 systemd-logind[808]: Removed session 33.
Feb 23 10:59:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:34.014 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.014 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:34.016 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.071 187643 DEBUG nova.compute.manager [req-718e85b7-dba7-4245-b852-83e9c4fbe98d req-c3464627-e407-49c1-8161-033e1f1e62be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.072 187643 DEBUG oslo_concurrency.lockutils [req-718e85b7-dba7-4245-b852-83e9c4fbe98d req-c3464627-e407-49c1-8161-033e1f1e62be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.073 187643 DEBUG oslo_concurrency.lockutils [req-718e85b7-dba7-4245-b852-83e9c4fbe98d req-c3464627-e407-49c1-8161-033e1f1e62be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.073 187643 DEBUG oslo_concurrency.lockutils [req-718e85b7-dba7-4245-b852-83e9c4fbe98d req-c3464627-e407-49c1-8161-033e1f1e62be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.074 187643 DEBUG nova.compute.manager [req-718e85b7-dba7-4245-b852-83e9c4fbe98d req-c3464627-e407-49c1-8161-033e1f1e62be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:34 compute-0 nova_compute[187639]: 2026-02-23 10:59:34.074 187643 DEBUG nova.compute.manager [req-718e85b7-dba7-4245-b852-83e9c4fbe98d req-c3464627-e407-49c1-8161-033e1f1e62be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.030 187643 INFO nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Took 6.99 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.031 187643 DEBUG nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.060 187643 DEBUG nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg9jxhx1q',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c11301ca-f35c-44f3-a976-afb41a5e66c9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0b9f03f1-a9dd-410f-883f-59f5196870ac),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.082 187643 DEBUG nova.objects.instance [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid c11301ca-f35c-44f3-a976-afb41a5e66c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.084 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.086 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.086 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.108 187643 DEBUG nova.virt.libvirt.vif [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1348569289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1348569289',id=8,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:58:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='737c024c3cf24bc7b040b295c4f6eae9',ramdisk_id='',reservation_id='r-sndy05jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1128369139',owner_user_name='tempest-TestExecuteBasicStrategy-1128369139-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T10:58:52Z,user_data=None,user_id='845a17801d704603adef909e3f49b086',uuid=c11301ca-f35c-44f3-a976-afb41a5e66c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.108 187643 DEBUG nova.network.os_vif_util [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.109 187643 DEBUG nova.network.os_vif_util [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.109 187643 DEBUG nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 10:59:35 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:be:87:df"/>
Feb 23 10:59:35 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 10:59:35 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 10:59:35 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 10:59:35 compute-0 nova_compute[187639]:   <target dev="tapad14dae8-a7"/>
Feb 23 10:59:35 compute-0 nova_compute[187639]: </interface>
Feb 23 10:59:35 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.109 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.374 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.589 187643 DEBUG nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.590 187643 INFO nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 10:59:35 compute-0 nova_compute[187639]: 2026-02-23 10:59:35.652 187643 INFO nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 10:59:35 compute-0 podman[210064]: 2026-02-23 10:59:35.857792093 +0000 UTC m=+0.056276008 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:59:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:36.018 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.189 187643 DEBUG nova.compute.manager [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.189 187643 DEBUG oslo_concurrency.lockutils [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.190 187643 DEBUG oslo_concurrency.lockutils [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.190 187643 DEBUG oslo_concurrency.lockutils [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.190 187643 DEBUG nova.compute.manager [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.190 187643 WARNING nova.compute.manager [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received unexpected event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with vm_state active and task_state migrating.
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.190 187643 DEBUG nova.compute.manager [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-changed-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.190 187643 DEBUG nova.compute.manager [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Refreshing instance network info cache due to event network-changed-ad14dae8-a715-40c6-bcdd-1f3de3f228fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.191 187643 DEBUG oslo_concurrency.lockutils [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.191 187643 DEBUG oslo_concurrency.lockutils [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.191 187643 DEBUG nova.network.neutron [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Refreshing network info cache for port ad14dae8-a715-40c6-bcdd-1f3de3f228fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.192 187643 DEBUG nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.192 187643 DEBUG nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.441 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.696 187643 DEBUG nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.698 187643 DEBUG nova.virt.libvirt.migration [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.803 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844376.80338, c11301ca-f35c-44f3-a976-afb41a5e66c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.804 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] VM Paused (Lifecycle Event)
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.834 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.838 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.856 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 10:59:36 compute-0 kernel: tapad14dae8-a7 (unregistering): left promiscuous mode
Feb 23 10:59:36 compute-0 NetworkManager[57207]: <info>  [1771844376.9311] device (tapad14dae8-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 10:59:36 compute-0 ovn_controller[97601]: 2026-02-23T10:59:36Z|00065|binding|INFO|Releasing lport ad14dae8-a715-40c6-bcdd-1f3de3f228fd from this chassis (sb_readonly=0)
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.938 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:36 compute-0 ovn_controller[97601]: 2026-02-23T10:59:36Z|00066|binding|INFO|Setting lport ad14dae8-a715-40c6-bcdd-1f3de3f228fd down in Southbound
Feb 23 10:59:36 compute-0 ovn_controller[97601]: 2026-02-23T10:59:36Z|00067|binding|INFO|Removing iface tapad14dae8-a7 ovn-installed in OVS
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.940 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:36.944 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:87:df 10.100.0.5'], port_security=['fa:16:3e:be:87:df 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c11301ca-f35c-44f3-a976-afb41a5e66c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '737c024c3cf24bc7b040b295c4f6eae9', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7200f997-db9c-488a-8ca5-9f959124bf4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec51f24c-5ac4-4e70-8d9a-50764c2b9502, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=ad14dae8-a715-40c6-bcdd-1f3de3f228fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:59:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:36.945 106968 INFO neutron.agent.ovn.metadata.agent [-] Port ad14dae8-a715-40c6-bcdd-1f3de3f228fd in datapath b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf unbound from our chassis
Feb 23 10:59:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:36.946 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:59:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:36.949 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[11d50945-7028-4010-9ea3-5648560f2f5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:36.950 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf namespace which is not needed anymore
Feb 23 10:59:36 compute-0 nova_compute[187639]: 2026-02-23 10:59:36.955 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:36 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 23 10:59:36 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 12.852s CPU time.
Feb 23 10:59:36 compute-0 systemd-machined[156970]: Machine qemu-5-instance-00000008 terminated.
Feb 23 10:59:37 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [NOTICE]   (209875) : haproxy version is 2.8.14-c23fe91
Feb 23 10:59:37 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [NOTICE]   (209875) : path to executable is /usr/sbin/haproxy
Feb 23 10:59:37 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [WARNING]  (209875) : Exiting Master process...
Feb 23 10:59:37 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [ALERT]    (209875) : Current worker (209877) exited with code 143 (Terminated)
Feb 23 10:59:37 compute-0 neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf[209871]: [WARNING]  (209875) : All workers exited. Exiting... (0)
Feb 23 10:59:37 compute-0 systemd[1]: libpod-8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0.scope: Deactivated successfully.
Feb 23 10:59:37 compute-0 podman[210115]: 2026-02-23 10:59:37.071278961 +0000 UTC m=+0.040963003 container died 8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 10:59:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-45cb11ad64d21ece4c4f17e6ba9962a0b06db81233b29ecb8a63251c3d3b4a5f-merged.mount: Deactivated successfully.
Feb 23 10:59:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0-userdata-shm.mount: Deactivated successfully.
Feb 23 10:59:37 compute-0 podman[210115]: 2026-02-23 10:59:37.095632415 +0000 UTC m=+0.065316437 container cleanup 8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 10:59:37 compute-0 systemd[1]: libpod-conmon-8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0.scope: Deactivated successfully.
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.128 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.132 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.159 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.160 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.160 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 10:59:37 compute-0 podman[210142]: 2026-02-23 10:59:37.171816169 +0000 UTC m=+0.056564567 container remove 8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.175 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ff00f655-83c6-4400-bd74-6b6525385248]: (4, ('Mon Feb 23 10:59:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf (8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0)\n8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0\nMon Feb 23 10:59:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf (8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0)\n8d42c0cfc11ddab0ce591c6c9af0af497913135a157c0d5e2f6fec385fae1ae0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.177 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[16b74849-6e35-4ccd-a085-f3e87db71fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.178 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb32ffd3e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.180 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:37 compute-0 kernel: tapb32ffd3e-b0: left promiscuous mode
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.189 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.189 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.191 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2093033e-1310-41ab-b2e6-f1048fe4efe7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.201 187643 DEBUG nova.virt.libvirt.guest [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c11301ca-f35c-44f3-a976-afb41a5e66c9' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.202 187643 INFO nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migration operation has completed
Feb 23 10:59:37 compute-0 nova_compute[187639]: 2026-02-23 10:59:37.202 187643 INFO nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] _post_live_migration() is started..
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.204 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[829ef5b6-760c-4c97-b3a6-4fd9ba43b8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.205 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[47622b91-040a-4bee-a450-7bc3a2f74206]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.216 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ed513574-8733-44a0-9ee6-437f0e36b862]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347628, 'reachable_time': 44950, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210176, 'error': None, 'target': 'ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.223 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 10:59:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 10:59:37.223 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[7da29cc6-6a14-4f33-a107-7c87f08443e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:59:37 compute-0 systemd[1]: run-netns-ovnmeta\x2db32ffd3e\x2db392\x2d4a63\x2dbbbb\x2d8352b0d3b8cf.mount: Deactivated successfully.
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.211 187643 DEBUG nova.network.neutron [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updated VIF entry in instance network info cache for port ad14dae8-a715-40c6-bcdd-1f3de3f228fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.212 187643 DEBUG nova.network.neutron [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Updating instance_info_cache with network_info: [{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.277 187643 DEBUG oslo_concurrency.lockutils [req-34fc404a-597f-4551-96c3-0c382e4f33f4 req-c7e1fad9-b7ca-43d1-ac2e-0736e1259ceb 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-c11301ca-f35c-44f3-a976-afb41a5e66c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.302 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.302 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.302 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.303 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.303 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.303 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.303 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.304 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.304 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.304 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.304 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.305 187643 WARNING nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received unexpected event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with vm_state active and task_state migrating.
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.305 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.305 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.305 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.306 187643 DEBUG oslo_concurrency.lockutils [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.306 187643 DEBUG nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.306 187643 WARNING nova.compute.manager [req-ca526060-a2d3-4462-b6f5-5550cf9aed92 req-a72eda24-a9d0-485a-82dc-5f77faa19694 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received unexpected event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with vm_state active and task_state migrating.
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.372 187643 DEBUG nova.compute.manager [req-43568fe2-f270-4043-a899-6e8f3968bd5f req-b78b7bd0-f394-4dc2-a584-9e2171f0b42c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.372 187643 DEBUG oslo_concurrency.lockutils [req-43568fe2-f270-4043-a899-6e8f3968bd5f req-b78b7bd0-f394-4dc2-a584-9e2171f0b42c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.372 187643 DEBUG oslo_concurrency.lockutils [req-43568fe2-f270-4043-a899-6e8f3968bd5f req-b78b7bd0-f394-4dc2-a584-9e2171f0b42c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.373 187643 DEBUG oslo_concurrency.lockutils [req-43568fe2-f270-4043-a899-6e8f3968bd5f req-b78b7bd0-f394-4dc2-a584-9e2171f0b42c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.373 187643 DEBUG nova.compute.manager [req-43568fe2-f270-4043-a899-6e8f3968bd5f req-b78b7bd0-f394-4dc2-a584-9e2171f0b42c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.373 187643 DEBUG nova.compute.manager [req-43568fe2-f270-4043-a899-6e8f3968bd5f req-b78b7bd0-f394-4dc2-a584-9e2171f0b42c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-unplugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.853 187643 DEBUG nova.network.neutron [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port ad14dae8-a715-40c6-bcdd-1f3de3f228fd and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.854 187643 DEBUG nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.854 187643 DEBUG nova.virt.libvirt.vif [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T10:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1348569289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1348569289',id=8,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T10:58:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='737c024c3cf24bc7b040b295c4f6eae9',ramdisk_id='',reservation_id='r-sndy05jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-TestExecuteBasicStrategy-1128369139',owner_user_name='tempest-TestExecuteBasicStrategy-1128369139-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T10:59:21Z,user_data=None,user_id='845a17801d704603adef909e3f49b086',uuid=c11301ca-f35c-44f3-a976-afb41a5e66c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.855 187643 DEBUG nova.network.os_vif_util [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "address": "fa:16:3e:be:87:df", "network": {"id": "b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1593104605-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "737c024c3cf24bc7b040b295c4f6eae9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad14dae8-a7", "ovs_interfaceid": "ad14dae8-a715-40c6-bcdd-1f3de3f228fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.855 187643 DEBUG nova.network.os_vif_util [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.856 187643 DEBUG os_vif [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.857 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.857 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad14dae8-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.904 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.905 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.907 187643 INFO os_vif [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:87:df,bridge_name='br-int',has_traffic_filtering=True,id=ad14dae8-a715-40c6-bcdd-1f3de3f228fd,network=Network(b32ffd3e-b392-4a63-bbbb-8352b0d3b8cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad14dae8-a7')
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.908 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.908 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.908 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.908 187643 DEBUG nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.909 187643 INFO nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Deleting instance files /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9_del
Feb 23 10:59:38 compute-0 nova_compute[187639]: 2026-02-23 10:59:38.909 187643 INFO nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Deletion of /var/lib/nova/instances/c11301ca-f35c-44f3-a976-afb41a5e66c9_del complete
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.375 187643 DEBUG nova.compute.manager [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.376 187643 DEBUG oslo_concurrency.lockutils [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.376 187643 DEBUG oslo_concurrency.lockutils [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.377 187643 DEBUG oslo_concurrency.lockutils [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.377 187643 DEBUG nova.compute.manager [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.378 187643 WARNING nova.compute.manager [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received unexpected event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with vm_state active and task_state migrating.
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.378 187643 DEBUG nova.compute.manager [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.379 187643 DEBUG oslo_concurrency.lockutils [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.379 187643 DEBUG oslo_concurrency.lockutils [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.380 187643 DEBUG oslo_concurrency.lockutils [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.380 187643 DEBUG nova.compute.manager [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] No waiting events found dispatching network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.381 187643 WARNING nova.compute.manager [req-2fe627a5-205f-4057-ab8e-b1239b216add req-7e49d9c7-2381-4781-89c0-0e22ade5bf5f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Received unexpected event network-vif-plugged-ad14dae8-a715-40c6-bcdd-1f3de3f228fd for instance with vm_state active and task_state migrating.
Feb 23 10:59:40 compute-0 nova_compute[187639]: 2026-02-23 10:59:40.412 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:40 compute-0 podman[210178]: 2026-02-23 10:59:40.894814867 +0000 UTC m=+0.092941187 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 10:59:42 compute-0 sshd-session[210204]: Connection closed by authenticating user root 165.227.79.48 port 56282 [preauth]
Feb 23 10:59:42 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 10:59:42 compute-0 systemd[210047]: Activating special unit Exit the Session...
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped target Main User Target.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped target Basic System.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped target Paths.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped target Sockets.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped target Timers.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 10:59:42 compute-0 systemd[210047]: Closed D-Bus User Message Bus Socket.
Feb 23 10:59:42 compute-0 systemd[210047]: Stopped Create User's Volatile Files and Directories.
Feb 23 10:59:42 compute-0 systemd[210047]: Removed slice User Application Slice.
Feb 23 10:59:42 compute-0 systemd[210047]: Reached target Shutdown.
Feb 23 10:59:42 compute-0 systemd[210047]: Finished Exit the Session.
Feb 23 10:59:42 compute-0 systemd[210047]: Reached target Exit the Session.
Feb 23 10:59:42 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 10:59:42 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 10:59:42 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 10:59:42 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 10:59:42 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 10:59:42 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 10:59:42 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 10:59:43 compute-0 nova_compute[187639]: 2026-02-23 10:59:43.906 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.886 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.887 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.887 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c11301ca-f35c-44f3-a976-afb41a5e66c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.911 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.912 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.912 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:44 compute-0 nova_compute[187639]: 2026-02-23 10:59:44.912 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:59:45 compute-0 podman[210210]: 2026-02-23 10:59:45.01226898 +0000 UTC m=+0.058004014 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.049 187643 WARNING nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.050 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5816MB free_disk=73.20626831054688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.050 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.051 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.093 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance c11301ca-f35c-44f3-a976-afb41a5e66c9 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.137 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.169 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration 0b9f03f1-a9dd-410f-883f-59f5196870ac is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.170 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.170 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.234 187643 DEBUG nova.compute.provider_tree [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.251 187643 DEBUG nova.scheduler.client.report [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.273 187643 DEBUG nova.compute.resource_tracker [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.274 187643 DEBUG oslo_concurrency.lockutils [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.281 187643 INFO nova.compute.manager [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.360 187643 INFO nova.scheduler.client.report [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration 0b9f03f1-a9dd-410f-883f-59f5196870ac
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.360 187643 DEBUG nova.virt.libvirt.driver [None req-57b2578a-da1f-4820-bae4-556283f0fbbc a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 10:59:45 compute-0 nova_compute[187639]: 2026-02-23 10:59:45.414 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:45 compute-0 sshd-session[210231]: Invalid user admin from 143.198.30.3 port 50696
Feb 23 10:59:45 compute-0 sshd-session[210231]: Connection closed by invalid user admin 143.198.30.3 port 50696 [preauth]
Feb 23 10:59:48 compute-0 nova_compute[187639]: 2026-02-23 10:59:48.944 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:50 compute-0 nova_compute[187639]: 2026-02-23 10:59:50.446 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:50 compute-0 nova_compute[187639]: 2026-02-23 10:59:50.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:52 compute-0 nova_compute[187639]: 2026-02-23 10:59:52.162 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844377.1548343, c11301ca-f35c-44f3-a976-afb41a5e66c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 10:59:52 compute-0 nova_compute[187639]: 2026-02-23 10:59:52.162 187643 INFO nova.compute.manager [-] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] VM Stopped (Lifecycle Event)
Feb 23 10:59:52 compute-0 nova_compute[187639]: 2026-02-23 10:59:52.200 187643 DEBUG nova.compute.manager [None req-0db6bb77-8aa3-4bf6-8a2d-1694a6537338 - - - - - -] [instance: c11301ca-f35c-44f3-a976-afb41a5e66c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 10:59:52 compute-0 nova_compute[187639]: 2026-02-23 10:59:52.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.718 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.718 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.719 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:53 compute-0 nova_compute[187639]: 2026-02-23 10:59:53.945 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.721 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.721 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.722 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.722 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.901 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.902 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.20624923706055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.903 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.903 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.987 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:59:54 compute-0 nova_compute[187639]: 2026-02-23 10:59:54.987 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:59:55 compute-0 nova_compute[187639]: 2026-02-23 10:59:55.009 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:59:55 compute-0 nova_compute[187639]: 2026-02-23 10:59:55.024 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:59:55 compute-0 nova_compute[187639]: 2026-02-23 10:59:55.026 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:59:55 compute-0 nova_compute[187639]: 2026-02-23 10:59:55.026 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:59:55 compute-0 nova_compute[187639]: 2026-02-23 10:59:55.447 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:57 compute-0 podman[210233]: 2026-02-23 10:59:57.860491151 +0000 UTC m=+0.054002639 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:59:58 compute-0 nova_compute[187639]: 2026-02-23 10:59:58.947 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:59:59 compute-0 nova_compute[187639]: 2026-02-23 10:59:59.025 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:59 compute-0 nova_compute[187639]: 2026-02-23 10:59:59.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:59:59 compute-0 podman[197002]: time="2026-02-23T10:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:59:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 10:59:59 compute-0 podman[197002]: @ - - [23/Feb/2026:10:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Feb 23 11:00:00 compute-0 nova_compute[187639]: 2026-02-23 11:00:00.448 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:01 compute-0 openstack_network_exporter[199919]: ERROR   11:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:00:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:00:01 compute-0 openstack_network_exporter[199919]: ERROR   11:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:00:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:00:03 compute-0 nova_compute[187639]: 2026-02-23 11:00:03.950 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:05 compute-0 nova_compute[187639]: 2026-02-23 11:00:05.450 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:06 compute-0 podman[210259]: 2026-02-23 11:00:06.846475203 +0000 UTC m=+0.048047970 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 11:00:08 compute-0 nova_compute[187639]: 2026-02-23 11:00:08.953 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:10 compute-0 nova_compute[187639]: 2026-02-23 11:00:10.452 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:11 compute-0 nova_compute[187639]: 2026-02-23 11:00:11.027 187643 DEBUG nova.compute.manager [None req-c49e2a8a-1bc3-4dab-84b8-c945f59baa6e d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 23 11:00:11 compute-0 nova_compute[187639]: 2026-02-23 11:00:11.098 187643 DEBUG nova.compute.provider_tree [None req-c49e2a8a-1bc3-4dab-84b8-c945f59baa6e d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 9 to 12 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:00:11 compute-0 podman[210278]: 2026-02-23 11:00:11.86397734 +0000 UTC m=+0.066382595 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:00:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:00:12.642 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:00:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:00:12.642 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:00:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:00:12.642 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:00:13 compute-0 nova_compute[187639]: 2026-02-23 11:00:13.956 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:15 compute-0 nova_compute[187639]: 2026-02-23 11:00:15.179 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:15 compute-0 nova_compute[187639]: 2026-02-23 11:00:15.453 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:15 compute-0 podman[210305]: 2026-02-23 11:00:15.845303856 +0000 UTC m=+0.049660464 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 11:00:18 compute-0 nova_compute[187639]: 2026-02-23 11:00:18.972 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:20 compute-0 nova_compute[187639]: 2026-02-23 11:00:20.496 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:21 compute-0 sshd-session[210327]: Invalid user admin from 143.198.30.3 port 33046
Feb 23 11:00:21 compute-0 sshd-session[210327]: Connection closed by invalid user admin 143.198.30.3 port 33046 [preauth]
Feb 23 11:00:23 compute-0 nova_compute[187639]: 2026-02-23 11:00:23.975 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:25 compute-0 nova_compute[187639]: 2026-02-23 11:00:25.541 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:28 compute-0 sshd-session[210329]: Connection closed by authenticating user root 165.227.79.48 port 53814 [preauth]
Feb 23 11:00:28 compute-0 podman[210331]: 2026-02-23 11:00:28.862276508 +0000 UTC m=+0.057857070 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:00:28 compute-0 nova_compute[187639]: 2026-02-23 11:00:28.978 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:29 compute-0 podman[197002]: time="2026-02-23T11:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:00:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:00:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 23 11:00:30 compute-0 nova_compute[187639]: 2026-02-23 11:00:30.588 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:31 compute-0 openstack_network_exporter[199919]: ERROR   11:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:00:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:00:31 compute-0 openstack_network_exporter[199919]: ERROR   11:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:00:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:00:33 compute-0 nova_compute[187639]: 2026-02-23 11:00:33.980 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:35 compute-0 nova_compute[187639]: 2026-02-23 11:00:35.588 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:00:36.240 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:00:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:00:36.242 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:00:36 compute-0 nova_compute[187639]: 2026-02-23 11:00:36.241 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:37 compute-0 podman[210355]: 2026-02-23 11:00:37.859813982 +0000 UTC m=+0.058046415 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 11:00:38 compute-0 nova_compute[187639]: 2026-02-23 11:00:38.983 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:40 compute-0 nova_compute[187639]: 2026-02-23 11:00:40.591 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:42 compute-0 podman[210375]: 2026-02-23 11:00:42.922730629 +0000 UTC m=+0.124790791 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 11:00:43 compute-0 nova_compute[187639]: 2026-02-23 11:00:43.985 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:00:45.244 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:00:45 compute-0 nova_compute[187639]: 2026-02-23 11:00:45.600 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:46 compute-0 podman[210401]: 2026-02-23 11:00:46.860127957 +0000 UTC m=+0.063227167 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Feb 23 11:00:49 compute-0 nova_compute[187639]: 2026-02-23 11:00:49.004 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:49 compute-0 ovn_controller[97601]: 2026-02-23T11:00:49Z|00068|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Feb 23 11:00:50 compute-0 nova_compute[187639]: 2026-02-23 11:00:50.648 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:51 compute-0 nova_compute[187639]: 2026-02-23 11:00:51.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:52 compute-0 nova_compute[187639]: 2026-02-23 11:00:52.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:54 compute-0 nova_compute[187639]: 2026-02-23 11:00:54.007 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.651 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:00:55 compute-0 sshd-session[210423]: Invalid user admin from 143.198.30.3 port 33030
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.715 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.716 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:55 compute-0 nova_compute[187639]: 2026-02-23 11:00:55.716 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:55 compute-0 sshd-session[210423]: Connection closed by invalid user admin 143.198.30.3 port 33030 [preauth]
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.717 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.718 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.840 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.841 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5834MB free_disk=73.20602035522461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.841 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.841 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.931 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.931 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.948 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.970 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.970 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 11:00:56 compute-0 nova_compute[187639]: 2026-02-23 11:00:56.986 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 11:00:57 compute-0 nova_compute[187639]: 2026-02-23 11:00:57.008 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 11:00:57 compute-0 nova_compute[187639]: 2026-02-23 11:00:57.028 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:00:57 compute-0 nova_compute[187639]: 2026-02-23 11:00:57.046 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:00:57 compute-0 nova_compute[187639]: 2026-02-23 11:00:57.048 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:00:57 compute-0 nova_compute[187639]: 2026-02-23 11:00:57.048 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:00:59 compute-0 nova_compute[187639]: 2026-02-23 11:00:59.010 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:00:59 compute-0 nova_compute[187639]: 2026-02-23 11:00:59.049 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:00:59 compute-0 podman[197002]: time="2026-02-23T11:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:00:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:00:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 23 11:00:59 compute-0 podman[210425]: 2026-02-23 11:00:59.862974565 +0000 UTC m=+0.060877817 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:01:00 compute-0 nova_compute[187639]: 2026-02-23 11:01:00.700 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:01 compute-0 CROND[210451]: (root) CMD (run-parts /etc/cron.hourly)
Feb 23 11:01:01 compute-0 run-parts[210454]: (/etc/cron.hourly) starting 0anacron
Feb 23 11:01:01 compute-0 anacron[210462]: Anacron started on 2026-02-23
Feb 23 11:01:01 compute-0 anacron[210462]: Will run job `cron.daily' in 42 min.
Feb 23 11:01:01 compute-0 anacron[210462]: Will run job `cron.weekly' in 62 min.
Feb 23 11:01:01 compute-0 anacron[210462]: Will run job `cron.monthly' in 82 min.
Feb 23 11:01:01 compute-0 anacron[210462]: Jobs will be executed sequentially
Feb 23 11:01:01 compute-0 openstack_network_exporter[199919]: ERROR   11:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:01:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:01:01 compute-0 openstack_network_exporter[199919]: ERROR   11:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:01:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:01:01 compute-0 run-parts[210464]: (/etc/cron.hourly) finished 0anacron
Feb 23 11:01:01 compute-0 CROND[210450]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 23 11:01:04 compute-0 nova_compute[187639]: 2026-02-23 11:01:04.058 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:05 compute-0 nova_compute[187639]: 2026-02-23 11:01:05.702 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:08 compute-0 podman[210465]: 2026-02-23 11:01:08.860614112 +0000 UTC m=+0.060561488 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:01:09 compute-0 nova_compute[187639]: 2026-02-23 11:01:09.060 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:10 compute-0 nova_compute[187639]: 2026-02-23 11:01:10.741 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:12.643 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:12.643 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:12.644 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:13 compute-0 podman[210484]: 2026-02-23 11:01:13.856182899 +0000 UTC m=+0.064518759 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:01:13 compute-0 sshd-session[210503]: Connection closed by authenticating user root 165.227.79.48 port 35104 [preauth]
Feb 23 11:01:14 compute-0 nova_compute[187639]: 2026-02-23 11:01:14.113 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:15 compute-0 nova_compute[187639]: 2026-02-23 11:01:15.743 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:17 compute-0 podman[210513]: 2026-02-23 11:01:17.838400306 +0000 UTC m=+0.045390643 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1770267347, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, 
build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 11:01:19 compute-0 nova_compute[187639]: 2026-02-23 11:01:19.115 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:20 compute-0 nova_compute[187639]: 2026-02-23 11:01:20.745 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:24 compute-0 nova_compute[187639]: 2026-02-23 11:01:24.151 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:25 compute-0 nova_compute[187639]: 2026-02-23 11:01:25.747 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:28 compute-0 sshd-session[210536]: Invalid user admin from 143.198.30.3 port 58612
Feb 23 11:01:28 compute-0 sshd-session[210536]: Connection closed by invalid user admin 143.198.30.3 port 58612 [preauth]
Feb 23 11:01:29 compute-0 nova_compute[187639]: 2026-02-23 11:01:29.195 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:29 compute-0 podman[197002]: time="2026-02-23T11:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:01:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:01:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 23 11:01:30 compute-0 nova_compute[187639]: 2026-02-23 11:01:30.757 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:30 compute-0 podman[210538]: 2026-02-23 11:01:30.854813065 +0000 UTC m=+0.056693420 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:01:31 compute-0 openstack_network_exporter[199919]: ERROR   11:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:01:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:01:31 compute-0 openstack_network_exporter[199919]: ERROR   11:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:01:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:01:31 compute-0 nova_compute[187639]: 2026-02-23 11:01:31.888 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:31 compute-0 nova_compute[187639]: 2026-02-23 11:01:31.889 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:31 compute-0 nova_compute[187639]: 2026-02-23 11:01:31.911 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:01:31 compute-0 nova_compute[187639]: 2026-02-23 11:01:31.989 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:31 compute-0 nova_compute[187639]: 2026-02-23 11:01:31.990 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:31.999 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.000 187643 INFO nova.compute.claims [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.095 187643 DEBUG nova.compute.provider_tree [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.112 187643 DEBUG nova.scheduler.client.report [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.135 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.136 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.182 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.182 187643 DEBUG nova.network.neutron [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.207 187643 INFO nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.227 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.332 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.334 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.334 187643 INFO nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Creating image(s)
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.336 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "/var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.336 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "/var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.337 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "/var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.362 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.382 187643 DEBUG nova.policy [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce5433cd709d4cf09187dc41a0817d24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9daefd52b5d441f4aa7111891776971e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.434 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.435 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.436 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.461 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.517 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.518 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.542 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.543 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.544 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.599 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.600 187643 DEBUG nova.virt.disk.api [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Checking if we can resize image /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.601 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.677 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.678 187643 DEBUG nova.virt.disk.api [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Cannot resize image /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.679 187643 DEBUG nova.objects.instance [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lazy-loading 'migration_context' on Instance uuid 140b123c-947b-432d-ad6e-6ffa17bd6ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.701 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.701 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Ensure instance console log exists: /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.702 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.703 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:32 compute-0 nova_compute[187639]: 2026-02-23 11:01:32.703 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.001 187643 DEBUG nova.network.neutron [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Successfully created port: 779b5d5e-1b21-416e-8082-24b46c4297d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.787 187643 DEBUG nova.network.neutron [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Successfully updated port: 779b5d5e-1b21-416e-8082-24b46c4297d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.805 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.805 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquired lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.806 187643 DEBUG nova.network.neutron [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.877 187643 DEBUG nova.compute.manager [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-changed-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.878 187643 DEBUG nova.compute.manager [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Refreshing instance network info cache due to event network-changed-779b5d5e-1b21-416e-8082-24b46c4297d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:01:33 compute-0 nova_compute[187639]: 2026-02-23 11:01:33.878 187643 DEBUG oslo_concurrency.lockutils [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:01:34 compute-0 nova_compute[187639]: 2026-02-23 11:01:34.198 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:35 compute-0 nova_compute[187639]: 2026-02-23 11:01:35.759 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:35 compute-0 nova_compute[187639]: 2026-02-23 11:01:35.851 187643 DEBUG nova.network.neutron [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.898 187643 DEBUG nova.network.neutron [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updating instance_info_cache with network_info: [{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.977 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Releasing lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.977 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Instance network_info: |[{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.977 187643 DEBUG oslo_concurrency.lockutils [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.978 187643 DEBUG nova.network.neutron [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Refreshing network info cache for port 779b5d5e-1b21-416e-8082-24b46c4297d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.980 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Start _get_guest_xml network_info=[{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.984 187643 WARNING nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.988 187643 DEBUG nova.virt.libvirt.host [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.988 187643 DEBUG nova.virt.libvirt.host [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.990 187643 DEBUG nova.virt.libvirt.host [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.990 187643 DEBUG nova.virt.libvirt.host [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.991 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.992 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.992 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.992 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.992 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.993 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.993 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.993 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.993 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.993 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.994 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.994 187643 DEBUG nova.virt.hardware [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.997 187643 DEBUG nova.virt.libvirt.vif [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1195638216',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1195638216',id=9,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9daefd52b5d441f4aa7111891776971e',ramdisk_id='',reservation_id='r-wig1m040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:01:32Z,user_data=None,user_id='ce5433cd709d4cf09187dc41a0817d24',uuid=140b123c-947b-432d-ad6e-6ffa17bd6ac8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.997 187643 DEBUG nova.network.os_vif_util [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Converting VIF {"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.998 187643 DEBUG nova.network.os_vif_util [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:01:38 compute-0 nova_compute[187639]: 2026-02-23 11:01:38.998 187643 DEBUG nova.objects.instance [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lazy-loading 'pci_devices' on Instance uuid 140b123c-947b-432d-ad6e-6ffa17bd6ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.034 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <uuid>140b123c-947b-432d-ad6e-6ffa17bd6ac8</uuid>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <name>instance-00000009</name>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1195638216</nova:name>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:01:38</nova:creationTime>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:user uuid="ce5433cd709d4cf09187dc41a0817d24">tempest-TestExecuteHostMaintenanceStrategy-1409397465-project-member</nova:user>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:project uuid="9daefd52b5d441f4aa7111891776971e">tempest-TestExecuteHostMaintenanceStrategy-1409397465</nova:project>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         <nova:port uuid="779b5d5e-1b21-416e-8082-24b46c4297d8">
Feb 23 11:01:39 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <system>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <entry name="serial">140b123c-947b-432d-ad6e-6ffa17bd6ac8</entry>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <entry name="uuid">140b123c-947b-432d-ad6e-6ffa17bd6ac8</entry>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </system>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <os>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </os>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <features>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </features>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.config"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:d8:c0:d3"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <target dev="tap779b5d5e-1b"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/console.log" append="off"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <video>
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </video>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:01:39 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:01:39 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:01:39 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:01:39 compute-0 nova_compute[187639]: </domain>
Feb 23 11:01:39 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.035 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Preparing to wait for external event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.035 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.035 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.035 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.036 187643 DEBUG nova.virt.libvirt.vif [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1195638216',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1195638216',id=9,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9daefd52b5d441f4aa7111891776971e',ramdisk_id='',reservation_id='r-wig1m040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:01:32Z,user_data=None,user_id='ce5433cd709d4cf09187dc41a0817d24',uuid=140b123c-947b-432d-ad6e-6ffa17bd6ac8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.036 187643 DEBUG nova.network.os_vif_util [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Converting VIF {"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.037 187643 DEBUG nova.network.os_vif_util [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.037 187643 DEBUG os_vif [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.038 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.038 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.039 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.041 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.041 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap779b5d5e-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.041 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap779b5d5e-1b, col_values=(('external_ids', {'iface-id': '779b5d5e-1b21-416e-8082-24b46c4297d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:c0:d3', 'vm-uuid': '140b123c-947b-432d-ad6e-6ffa17bd6ac8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.043 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.045 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:01:39 compute-0 NetworkManager[57207]: <info>  [1771844499.0461] manager: (tap779b5d5e-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.048 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.050 187643 INFO os_vif [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b')
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.113 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.114 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.114 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] No VIF found with MAC fa:16:3e:d8:c0:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:01:39 compute-0 nova_compute[187639]: 2026-02-23 11:01:39.115 187643 INFO nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Using config drive
Feb 23 11:01:39 compute-0 podman[210580]: 2026-02-23 11:01:39.856766622 +0000 UTC m=+0.059923532 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.022 187643 INFO nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Creating config drive at /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.config
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.029 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp7mw2dxxb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.152 187643 DEBUG oslo_concurrency.processutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp7mw2dxxb" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:40 compute-0 kernel: tap779b5d5e-1b: entered promiscuous mode
Feb 23 11:01:40 compute-0 NetworkManager[57207]: <info>  [1771844500.2065] manager: (tap779b5d5e-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.209 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 ovn_controller[97601]: 2026-02-23T11:01:40Z|00069|binding|INFO|Claiming lport 779b5d5e-1b21-416e-8082-24b46c4297d8 for this chassis.
Feb 23 11:01:40 compute-0 ovn_controller[97601]: 2026-02-23T11:01:40Z|00070|binding|INFO|779b5d5e-1b21-416e-8082-24b46c4297d8: Claiming fa:16:3e:d8:c0:d3 10.100.0.11
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.217 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.230 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:c0:d3 10.100.0.11'], port_security=['fa:16:3e:d8:c0:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '140b123c-947b-432d-ad6e-6ffa17bd6ac8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9daefd52b5d441f4aa7111891776971e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a18454e6-607d-414a-9c16-b6c32d0427cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a4f6995-4ff2-4f5f-8e19-26ce1a3fe182, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=779b5d5e-1b21-416e-8082-24b46c4297d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.232 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 779b5d5e-1b21-416e-8082-24b46c4297d8 in datapath ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 bound to our chassis
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.235 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1
Feb 23 11:01:40 compute-0 systemd-udevd[210617]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:01:40 compute-0 systemd-machined[156970]: New machine qemu-6-instance-00000009.
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.243 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[58e17aa8-9698-46cc-8741-e72f543aaa9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.244 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapba01db54-f1 in ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.246 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapba01db54-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.246 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6204f5-bea7-4501-950e-0ff0d330c64c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.248 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0c5965-9e70-4962-a976-f9a876fafb96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000009.
Feb 23 11:01:40 compute-0 NetworkManager[57207]: <info>  [1771844500.2537] device (tap779b5d5e-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:01:40 compute-0 NetworkManager[57207]: <info>  [1771844500.2547] device (tap779b5d5e-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:01:40 compute-0 ovn_controller[97601]: 2026-02-23T11:01:40Z|00071|binding|INFO|Setting lport 779b5d5e-1b21-416e-8082-24b46c4297d8 ovn-installed in OVS
Feb 23 11:01:40 compute-0 ovn_controller[97601]: 2026-02-23T11:01:40Z|00072|binding|INFO|Setting lport 779b5d5e-1b21-416e-8082-24b46c4297d8 up in Southbound
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.257 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.258 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[24ce3e34-489b-49e2-ac35-dcf5f4893051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.273 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[eff296a6-61bd-4082-823e-0d00100b02fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.296 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[e980b45d-7cd7-474e-8aa8-fd780cc9fdea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 systemd-udevd[210621]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.301 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[32f8a424-6d2c-45f5-bc42-fdbb365d1266]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 NetworkManager[57207]: <info>  [1771844500.3039] manager: (tapba01db54-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.330 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8e4d51-5952-45f5-9103-ec8ca7201d99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.334 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[3af1d6bb-268e-4628-9a74-9e82ba67ab64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 NetworkManager[57207]: <info>  [1771844500.3522] device (tapba01db54-f0): carrier: link connected
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.357 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[088c6f76-250a-4a27-a596-990ae8a4b43a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.373 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6ebeb107-6316-4275-b212-9eb835750f6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba01db54-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:70:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364451, 'reachable_time': 33326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210650, 'error': None, 'target': 'ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.384 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a587cb97-f0c1-490f-a419-e2ca6ce2dac5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:702b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364451, 'tstamp': 364451}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210651, 'error': None, 'target': 'ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.398 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4222e5-1d75-4b28-b367-0ce11225fb35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba01db54-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:70:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364451, 'reachable_time': 33326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210652, 'error': None, 'target': 'ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.420 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a9cb69b8-6cfa-4351-b77e-450bbbfe9f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.469 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[756e8590-93b5-4c2b-a226-1e47406eb552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.470 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba01db54-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.471 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.472 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba01db54-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.509 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 NetworkManager[57207]: <info>  [1771844500.5106] manager: (tapba01db54-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Feb 23 11:01:40 compute-0 kernel: tapba01db54-f0: entered promiscuous mode
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.511 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.513 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapba01db54-f0, col_values=(('external_ids', {'iface-id': '28d8df62-59d1-413e-a0df-3895c332b63c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.514 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 ovn_controller[97601]: 2026-02-23T11:01:40Z|00073|binding|INFO|Releasing lport 28d8df62-59d1-413e-a0df-3895c332b63c from this chassis (sb_readonly=0)
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.518 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.519 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.520 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[842d9de7-ac79-47f2-827a-4a8174a8a45c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.521 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1.pid.haproxy
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:01:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:40.522 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'env', 'PROCESS_TAG=haproxy-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:01:40 compute-0 nova_compute[187639]: 2026-02-23 11:01:40.760 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:40 compute-0 podman[210685]: 2026-02-23 11:01:40.822366939 +0000 UTC m=+0.048263887 container create 7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:01:40 compute-0 systemd[1]: Started libpod-conmon-7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14.scope.
Feb 23 11:01:40 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:01:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a84ff0255a2f21dcecf1a5fb917605fa6a9da3c032fc9f1930dd07ef3c744e8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:01:40 compute-0 podman[210685]: 2026-02-23 11:01:40.873757784 +0000 UTC m=+0.099654742 container init 7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:01:40 compute-0 podman[210685]: 2026-02-23 11:01:40.877403237 +0000 UTC m=+0.103300175 container start 7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:01:40 compute-0 podman[210685]: 2026-02-23 11:01:40.797794114 +0000 UTC m=+0.023691072 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:01:40 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [NOTICE]   (210705) : New worker (210707) forked
Feb 23 11:01:40 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [NOTICE]   (210705) : Loading success.
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.112 187643 DEBUG nova.compute.manager [req-1baf8c65-0665-4d97-b69b-8dd5708bd2bb req-2d503dd8-3f53-4488-a859-b3cb9f13fc2d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.113 187643 DEBUG oslo_concurrency.lockutils [req-1baf8c65-0665-4d97-b69b-8dd5708bd2bb req-2d503dd8-3f53-4488-a859-b3cb9f13fc2d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.113 187643 DEBUG oslo_concurrency.lockutils [req-1baf8c65-0665-4d97-b69b-8dd5708bd2bb req-2d503dd8-3f53-4488-a859-b3cb9f13fc2d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.113 187643 DEBUG oslo_concurrency.lockutils [req-1baf8c65-0665-4d97-b69b-8dd5708bd2bb req-2d503dd8-3f53-4488-a859-b3cb9f13fc2d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.114 187643 DEBUG nova.compute.manager [req-1baf8c65-0665-4d97-b69b-8dd5708bd2bb req-2d503dd8-3f53-4488-a859-b3cb9f13fc2d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Processing event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:01:41 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:41.234 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.234 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:41 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:41.236 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.401 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844501.4008157, 140b123c-947b-432d-ad6e-6ffa17bd6ac8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.401 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] VM Started (Lifecycle Event)
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.404 187643 DEBUG nova.network.neutron [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updated VIF entry in instance network info cache for port 779b5d5e-1b21-416e-8082-24b46c4297d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.405 187643 DEBUG nova.network.neutron [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updating instance_info_cache with network_info: [{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.406 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.409 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.413 187643 INFO nova.virt.libvirt.driver [-] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Instance spawned successfully.
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.413 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.428 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.429 187643 DEBUG oslo_concurrency.lockutils [req-9b7bb51b-b833-4558-b37b-4f7b1523824e req-b625ed47-176c-4d48-ad51-97d3df1ce9ba 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.433 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.436 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.436 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.436 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.436 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.437 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.437 187643 DEBUG nova.virt.libvirt.driver [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.459 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.459 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844501.4010046, 140b123c-947b-432d-ad6e-6ffa17bd6ac8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.460 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] VM Paused (Lifecycle Event)
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.483 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.487 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844501.4087253, 140b123c-947b-432d-ad6e-6ffa17bd6ac8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.487 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] VM Resumed (Lifecycle Event)
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.495 187643 INFO nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Took 9.16 seconds to spawn the instance on the hypervisor.
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.495 187643 DEBUG nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.502 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.505 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.535 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.559 187643 INFO nova.compute.manager [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Took 9.60 seconds to build instance.
Feb 23 11:01:41 compute-0 nova_compute[187639]: 2026-02-23 11:01:41.573 187643 DEBUG oslo_concurrency.lockutils [None req-d2d7aa9e-ab65-4569-bc20-3ca3046ce232 ce5433cd709d4cf09187dc41a0817d24 9daefd52b5d441f4aa7111891776971e - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:43 compute-0 nova_compute[187639]: 2026-02-23 11:01:43.240 187643 DEBUG nova.compute.manager [req-f2ce9366-cab5-4e1e-8273-dea080d7ecfb req-e18c7004-dfdf-4ac8-9ab8-f6766858e7fd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:01:43 compute-0 nova_compute[187639]: 2026-02-23 11:01:43.241 187643 DEBUG oslo_concurrency.lockutils [req-f2ce9366-cab5-4e1e-8273-dea080d7ecfb req-e18c7004-dfdf-4ac8-9ab8-f6766858e7fd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:43 compute-0 nova_compute[187639]: 2026-02-23 11:01:43.241 187643 DEBUG oslo_concurrency.lockutils [req-f2ce9366-cab5-4e1e-8273-dea080d7ecfb req-e18c7004-dfdf-4ac8-9ab8-f6766858e7fd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:43 compute-0 nova_compute[187639]: 2026-02-23 11:01:43.242 187643 DEBUG oslo_concurrency.lockutils [req-f2ce9366-cab5-4e1e-8273-dea080d7ecfb req-e18c7004-dfdf-4ac8-9ab8-f6766858e7fd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:43 compute-0 nova_compute[187639]: 2026-02-23 11:01:43.242 187643 DEBUG nova.compute.manager [req-f2ce9366-cab5-4e1e-8273-dea080d7ecfb req-e18c7004-dfdf-4ac8-9ab8-f6766858e7fd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:01:43 compute-0 nova_compute[187639]: 2026-02-23 11:01:43.243 187643 WARNING nova.compute.manager [req-f2ce9366-cab5-4e1e-8273-dea080d7ecfb req-e18c7004-dfdf-4ac8-9ab8-f6766858e7fd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received unexpected event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with vm_state active and task_state None.
Feb 23 11:01:44 compute-0 nova_compute[187639]: 2026-02-23 11:01:44.043 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:44 compute-0 podman[210723]: 2026-02-23 11:01:44.881858387 +0000 UTC m=+0.085932403 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 11:01:45 compute-0 nova_compute[187639]: 2026-02-23 11:01:45.763 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:01:46.238 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:01:48 compute-0 podman[210750]: 2026-02-23 11:01:48.867656415 +0000 UTC m=+0.062515819 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible)
Feb 23 11:01:49 compute-0 nova_compute[187639]: 2026-02-23 11:01:49.046 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:50 compute-0 nova_compute[187639]: 2026-02-23 11:01:50.764 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:52 compute-0 nova_compute[187639]: 2026-02-23 11:01:52.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:52 compute-0 ovn_controller[97601]: 2026-02-23T11:01:52Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:c0:d3 10.100.0.11
Feb 23 11:01:52 compute-0 ovn_controller[97601]: 2026-02-23T11:01:52Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:c0:d3 10.100.0.11
Feb 23 11:01:53 compute-0 nova_compute[187639]: 2026-02-23 11:01:53.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:54 compute-0 nova_compute[187639]: 2026-02-23 11:01:54.049 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:55 compute-0 nova_compute[187639]: 2026-02-23 11:01:55.766 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:56 compute-0 nova_compute[187639]: 2026-02-23 11:01:56.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:56 compute-0 nova_compute[187639]: 2026-02-23 11:01:56.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:56 compute-0 nova_compute[187639]: 2026-02-23 11:01:56.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.890 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.890 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.891 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:01:57 compute-0 nova_compute[187639]: 2026-02-23 11:01:57.891 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 140b123c-947b-432d-ad6e-6ffa17bd6ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:01:58 compute-0 sshd-session[210786]: Connection closed by authenticating user root 165.227.79.48 port 52448 [preauth]
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.084 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.194 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updating instance_info_cache with network_info: [{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.215 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.215 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.216 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.216 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.216 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.238 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.238 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.239 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.239 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.341 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.384 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.386 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.438 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.606 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.607 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5641MB free_disk=73.17728042602539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.608 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.608 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.682 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance 140b123c-947b-432d-ad6e-6ffa17bd6ac8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.683 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.683 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.719 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.737 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:01:59 compute-0 podman[197002]: time="2026-02-23T11:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:01:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:01:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.768 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:01:59 compute-0 nova_compute[187639]: 2026-02-23 11:01:59.768 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:00 compute-0 nova_compute[187639]: 2026-02-23 11:02:00.769 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:01 compute-0 nova_compute[187639]: 2026-02-23 11:02:01.243 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:01 compute-0 openstack_network_exporter[199919]: ERROR   11:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:02:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:02:01 compute-0 openstack_network_exporter[199919]: ERROR   11:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:02:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:02:01 compute-0 nova_compute[187639]: 2026-02-23 11:02:01.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:01 compute-0 podman[210795]: 2026-02-23 11:02:01.862559222 +0000 UTC m=+0.063036042 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:02:02 compute-0 sshd-session[210820]: Invalid user admin from 143.198.30.3 port 42390
Feb 23 11:02:02 compute-0 sshd-session[210820]: Connection closed by invalid user admin 143.198.30.3 port 42390 [preauth]
Feb 23 11:02:04 compute-0 nova_compute[187639]: 2026-02-23 11:02:04.087 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:05 compute-0 nova_compute[187639]: 2026-02-23 11:02:05.771 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:06 compute-0 nova_compute[187639]: 2026-02-23 11:02:06.417 187643 DEBUG nova.compute.manager [None req-f1f70999-9eb0-4704-80de-953a945ad55d a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 23 11:02:06 compute-0 nova_compute[187639]: 2026-02-23 11:02:06.475 187643 DEBUG nova.compute.provider_tree [None req-f1f70999-9eb0-4704-80de-953a945ad55d a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 12 to 14 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:02:09 compute-0 nova_compute[187639]: 2026-02-23 11:02:09.089 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:10 compute-0 nova_compute[187639]: 2026-02-23 11:02:10.773 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:10 compute-0 nova_compute[187639]: 2026-02-23 11:02:10.795 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Check if temp file /var/lib/nova/instances/tmpxo2fsxy3 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 11:02:10 compute-0 nova_compute[187639]: 2026-02-23 11:02:10.796 187643 DEBUG nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxo2fsxy3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='140b123c-947b-432d-ad6e-6ffa17bd6ac8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 11:02:10 compute-0 podman[210822]: 2026-02-23 11:02:10.85042529 +0000 UTC m=+0.053252693 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 11:02:11 compute-0 nova_compute[187639]: 2026-02-23 11:02:11.499 187643 DEBUG oslo_concurrency.processutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:02:11 compute-0 nova_compute[187639]: 2026-02-23 11:02:11.543 187643 DEBUG oslo_concurrency.processutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:02:11 compute-0 nova_compute[187639]: 2026-02-23 11:02:11.545 187643 DEBUG oslo_concurrency.processutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:02:11 compute-0 nova_compute[187639]: 2026-02-23 11:02:11.598 187643 DEBUG oslo_concurrency.processutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:02:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:12.644 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:12.644 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:12.645 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:14 compute-0 nova_compute[187639]: 2026-02-23 11:02:14.091 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:14 compute-0 sshd-session[210847]: Accepted publickey for nova from 192.168.122.101 port 39998 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 11:02:14 compute-0 systemd-logind[808]: New session 35 of user nova.
Feb 23 11:02:14 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 11:02:14 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 11:02:14 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 11:02:14 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 11:02:14 compute-0 systemd[210851]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:02:14 compute-0 systemd[210851]: Queued start job for default target Main User Target.
Feb 23 11:02:14 compute-0 systemd[210851]: Created slice User Application Slice.
Feb 23 11:02:14 compute-0 systemd[210851]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:02:14 compute-0 systemd[210851]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 11:02:14 compute-0 systemd[210851]: Reached target Paths.
Feb 23 11:02:14 compute-0 systemd[210851]: Reached target Timers.
Feb 23 11:02:14 compute-0 systemd[210851]: Starting D-Bus User Message Bus Socket...
Feb 23 11:02:14 compute-0 systemd[210851]: Starting Create User's Volatile Files and Directories...
Feb 23 11:02:14 compute-0 systemd[210851]: Finished Create User's Volatile Files and Directories.
Feb 23 11:02:14 compute-0 systemd[210851]: Listening on D-Bus User Message Bus Socket.
Feb 23 11:02:14 compute-0 systemd[210851]: Reached target Sockets.
Feb 23 11:02:14 compute-0 systemd[210851]: Reached target Basic System.
Feb 23 11:02:14 compute-0 systemd[210851]: Reached target Main User Target.
Feb 23 11:02:14 compute-0 systemd[210851]: Startup finished in 95ms.
Feb 23 11:02:14 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 11:02:14 compute-0 systemd[1]: Started Session 35 of User nova.
Feb 23 11:02:14 compute-0 sshd-session[210847]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:02:14 compute-0 sshd-session[210866]: Received disconnect from 192.168.122.101 port 39998:11: disconnected by user
Feb 23 11:02:14 compute-0 sshd-session[210866]: Disconnected from user nova 192.168.122.101 port 39998
Feb 23 11:02:14 compute-0 sshd-session[210847]: pam_unix(sshd:session): session closed for user nova
Feb 23 11:02:14 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Feb 23 11:02:14 compute-0 systemd-logind[808]: Session 35 logged out. Waiting for processes to exit.
Feb 23 11:02:14 compute-0 systemd-logind[808]: Removed session 35.
Feb 23 11:02:15 compute-0 podman[210868]: 2026-02-23 11:02:15.017897563 +0000 UTC m=+0.068616354 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 11:02:15 compute-0 nova_compute[187639]: 2026-02-23 11:02:15.774 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:16 compute-0 nova_compute[187639]: 2026-02-23 11:02:16.144 187643 DEBUG nova.compute.manager [req-323c9822-f9e9-489e-973f-b6ba930bda93 req-d5f93a0b-5d87-427c-9cfc-69fe48645978 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:16 compute-0 nova_compute[187639]: 2026-02-23 11:02:16.144 187643 DEBUG oslo_concurrency.lockutils [req-323c9822-f9e9-489e-973f-b6ba930bda93 req-d5f93a0b-5d87-427c-9cfc-69fe48645978 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:16 compute-0 nova_compute[187639]: 2026-02-23 11:02:16.144 187643 DEBUG oslo_concurrency.lockutils [req-323c9822-f9e9-489e-973f-b6ba930bda93 req-d5f93a0b-5d87-427c-9cfc-69fe48645978 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:16 compute-0 nova_compute[187639]: 2026-02-23 11:02:16.145 187643 DEBUG oslo_concurrency.lockutils [req-323c9822-f9e9-489e-973f-b6ba930bda93 req-d5f93a0b-5d87-427c-9cfc-69fe48645978 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:16 compute-0 nova_compute[187639]: 2026-02-23 11:02:16.145 187643 DEBUG nova.compute.manager [req-323c9822-f9e9-489e-973f-b6ba930bda93 req-d5f93a0b-5d87-427c-9cfc-69fe48645978 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:16 compute-0 nova_compute[187639]: 2026-02-23 11:02:16.145 187643 DEBUG nova.compute.manager [req-323c9822-f9e9-489e-973f-b6ba930bda93 req-d5f93a0b-5d87-427c-9cfc-69fe48645978 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.295 187643 INFO nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Took 5.70 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.295 187643 DEBUG nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.342 187643 DEBUG nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxo2fsxy3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='140b123c-947b-432d-ad6e-6ffa17bd6ac8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(099359bc-28d9-4d4a-9da4-dcecb9bff51a),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.370 187643 DEBUG nova.objects.instance [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid 140b123c-947b-432d-ad6e-6ffa17bd6ac8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.372 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.374 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.374 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.412 187643 DEBUG nova.virt.libvirt.vif [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1195638216',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1195638216',id=9,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:01:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9daefd52b5d441f4aa7111891776971e',ramdisk_id='',reservation_id='r-wig1m040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:01:41Z,user_data=None,user_id='ce5433cd709d4cf09187dc41a0817d24',uuid=140b123c-947b-432d-ad6e-6ffa17bd6ac8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.412 187643 DEBUG nova.network.os_vif_util [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.413 187643 DEBUG nova.network.os_vif_util [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.413 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 11:02:17 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:d8:c0:d3"/>
Feb 23 11:02:17 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 11:02:17 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:02:17 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 11:02:17 compute-0 nova_compute[187639]:   <target dev="tap779b5d5e-1b"/>
Feb 23 11:02:17 compute-0 nova_compute[187639]: </interface>
Feb 23 11:02:17 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.414 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.877 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.878 187643 INFO nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 11:02:17 compute-0 nova_compute[187639]: 2026-02-23 11:02:17.973 187643 INFO nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.233 187643 DEBUG nova.compute.manager [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.233 187643 DEBUG oslo_concurrency.lockutils [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.234 187643 DEBUG oslo_concurrency.lockutils [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.234 187643 DEBUG oslo_concurrency.lockutils [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.234 187643 DEBUG nova.compute.manager [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.234 187643 WARNING nova.compute.manager [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received unexpected event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with vm_state active and task_state migrating.
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.234 187643 DEBUG nova.compute.manager [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-changed-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.235 187643 DEBUG nova.compute.manager [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Refreshing instance network info cache due to event network-changed-779b5d5e-1b21-416e-8082-24b46c4297d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.235 187643 DEBUG oslo_concurrency.lockutils [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.235 187643 DEBUG oslo_concurrency.lockutils [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.235 187643 DEBUG nova.network.neutron [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Refreshing network info cache for port 779b5d5e-1b21-416e-8082-24b46c4297d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.476 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.476 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.979 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:02:18 compute-0 nova_compute[187639]: 2026-02-23 11:02:18.979 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:02:19 compute-0 nova_compute[187639]: 2026-02-23 11:02:19.150 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:19 compute-0 nova_compute[187639]: 2026-02-23 11:02:19.482 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:02:19 compute-0 nova_compute[187639]: 2026-02-23 11:02:19.483 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:02:19 compute-0 podman[210902]: 2026-02-23 11:02:19.836857503 +0000 UTC m=+0.044699026 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 11:02:19 compute-0 nova_compute[187639]: 2026-02-23 11:02:19.986 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:02:19 compute-0 nova_compute[187639]: 2026-02-23 11:02:19.987 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.399 187643 DEBUG nova.network.neutron [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updated VIF entry in instance network info cache for port 779b5d5e-1b21-416e-8082-24b46c4297d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.400 187643 DEBUG nova.network.neutron [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Updating instance_info_cache with network_info: [{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.490 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.491 187643 DEBUG nova.virt.libvirt.migration [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.500 187643 DEBUG oslo_concurrency.lockutils [req-c6e37820-cb26-4eb5-89f5-b7a763d5a589 req-c3e515fc-bd5b-43a4-8e3a-15810a1ffecf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-140b123c-947b-432d-ad6e-6ffa17bd6ac8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.576 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844540.576022, 140b123c-947b-432d-ad6e-6ffa17bd6ac8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.576 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] VM Paused (Lifecycle Event)
Feb 23 11:02:20 compute-0 ovn_controller[97601]: 2026-02-23T11:02:20Z|00074|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.653 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.657 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.678 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 11:02:20 compute-0 kernel: tap779b5d5e-1b (unregistering): left promiscuous mode
Feb 23 11:02:20 compute-0 NetworkManager[57207]: <info>  [1771844540.6951] device (tap779b5d5e-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:02:20 compute-0 ovn_controller[97601]: 2026-02-23T11:02:20Z|00075|binding|INFO|Releasing lport 779b5d5e-1b21-416e-8082-24b46c4297d8 from this chassis (sb_readonly=0)
Feb 23 11:02:20 compute-0 ovn_controller[97601]: 2026-02-23T11:02:20Z|00076|binding|INFO|Setting lport 779b5d5e-1b21-416e-8082-24b46c4297d8 down in Southbound
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.730 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:20 compute-0 ovn_controller[97601]: 2026-02-23T11:02:20Z|00077|binding|INFO|Removing iface tap779b5d5e-1b ovn-installed in OVS
Feb 23 11:02:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:20.736 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:c0:d3 10.100.0.11'], port_security=['fa:16:3e:d8:c0:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '140b123c-947b-432d-ad6e-6ffa17bd6ac8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9daefd52b5d441f4aa7111891776971e', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a18454e6-607d-414a-9c16-b6c32d0427cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a4f6995-4ff2-4f5f-8e19-26ce1a3fe182, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=779b5d5e-1b21-416e-8082-24b46c4297d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:02:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:20.737 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 779b5d5e-1b21-416e-8082-24b46c4297d8 in datapath ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 unbound from our chassis
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.738 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:20.738 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:02:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:20.739 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3f17fdad-8387-4f42-8838-44251b27ccb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:20.740 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 namespace which is not needed anymore
Feb 23 11:02:20 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 23 11:02:20 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Consumed 13.392s CPU time.
Feb 23 11:02:20 compute-0 systemd-machined[156970]: Machine qemu-6-instance-00000009 terminated.
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.776 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:20 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [NOTICE]   (210705) : haproxy version is 2.8.14-c23fe91
Feb 23 11:02:20 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [NOTICE]   (210705) : path to executable is /usr/sbin/haproxy
Feb 23 11:02:20 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [WARNING]  (210705) : Exiting Master process...
Feb 23 11:02:20 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [ALERT]    (210705) : Current worker (210707) exited with code 143 (Terminated)
Feb 23 11:02:20 compute-0 neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1[210701]: [WARNING]  (210705) : All workers exited. Exiting... (0)
Feb 23 11:02:20 compute-0 systemd[1]: libpod-7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14.scope: Deactivated successfully.
Feb 23 11:02:20 compute-0 podman[210951]: 2026-02-23 11:02:20.886575455 +0000 UTC m=+0.055207183 container died 7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:02:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14-userdata-shm.mount: Deactivated successfully.
Feb 23 11:02:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-a84ff0255a2f21dcecf1a5fb917605fa6a9da3c032fc9f1930dd07ef3c744e8d-merged.mount: Deactivated successfully.
Feb 23 11:02:20 compute-0 podman[210951]: 2026-02-23 11:02:20.918541367 +0000 UTC m=+0.087173105 container cleanup 7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.921 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.922 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.923 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 11:02:20 compute-0 systemd[1]: libpod-conmon-7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14.scope: Deactivated successfully.
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.994 187643 DEBUG nova.virt.libvirt.guest [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '140b123c-947b-432d-ad6e-6ffa17bd6ac8' (instance-00000009) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.995 187643 INFO nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migration operation has completed
Feb 23 11:02:20 compute-0 nova_compute[187639]: 2026-02-23 11:02:20.995 187643 INFO nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] _post_live_migration() is started..
Feb 23 11:02:20 compute-0 podman[211000]: 2026-02-23 11:02:20.996118118 +0000 UTC m=+0.052901665 container remove 7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.000 187643 DEBUG nova.compute.manager [req-ee9bee2d-bd12-4eb9-af6c-2d3050a4c80d req-86959d55-e087-477c-b3e0-1200366d6f71 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.000 187643 DEBUG oslo_concurrency.lockutils [req-ee9bee2d-bd12-4eb9-af6c-2d3050a4c80d req-86959d55-e087-477c-b3e0-1200366d6f71 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.001 187643 DEBUG oslo_concurrency.lockutils [req-ee9bee2d-bd12-4eb9-af6c-2d3050a4c80d req-86959d55-e087-477c-b3e0-1200366d6f71 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.001 187643 DEBUG oslo_concurrency.lockutils [req-ee9bee2d-bd12-4eb9-af6c-2d3050a4c80d req-86959d55-e087-477c-b3e0-1200366d6f71 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.001 187643 DEBUG nova.compute.manager [req-ee9bee2d-bd12-4eb9-af6c-2d3050a4c80d req-86959d55-e087-477c-b3e0-1200366d6f71 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.001 187643 DEBUG nova.compute.manager [req-ee9bee2d-bd12-4eb9-af6c-2d3050a4c80d req-86959d55-e087-477c-b3e0-1200366d6f71 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.001 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0120e472-831c-4a5b-b6a3-31fd59be39ef]: (4, ('Mon Feb 23 11:02:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 (7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14)\n7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14\nMon Feb 23 11:02:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 (7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14)\n7dbf83848fe4e849081ce78ff42a66c7d9a0587e19227ab4400943ed379a9e14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.003 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[22f9c0fb-0876-4a8d-848c-7b109176bbf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.004 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba01db54-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:02:21 compute-0 kernel: tapba01db54-f0: left promiscuous mode
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.005 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.016 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.018 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[72255b31-304f-4054-878e-84c193fdd52d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.041 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[c42e9ef8-cd09-4412-944c-4f187a69853d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.042 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b0c580-816c-4e99-8c6d-0f0683f693fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.052 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[10d7ee0d-78e5-4854-8b0d-935df906db62]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364445, 'reachable_time': 25921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211020, 'error': None, 'target': 'ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 systemd[1]: run-netns-ovnmeta\x2dba01db54\x2dffea\x2d4bcc\x2d9cf0\x2de77b68ebf7d1.mount: Deactivated successfully.
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.059 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:02:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:02:21.059 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[d2069659-6cc1-4f07-818a-6c2b2ffa984a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.817 187643 DEBUG nova.network.neutron [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port 779b5d5e-1b21-416e-8082-24b46c4297d8 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.818 187643 DEBUG nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.819 187643 DEBUG nova.virt.libvirt.vif [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1195638216',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1195638216',id=9,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:01:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9daefd52b5d441f4aa7111891776971e',ramdisk_id='',reservation_id='r-wig1m040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1409397465-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:02:09Z,user_data=None,user_id='ce5433cd709d4cf09187dc41a0817d24',uuid=140b123c-947b-432d-ad6e-6ffa17bd6ac8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.819 187643 DEBUG nova.network.os_vif_util [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "779b5d5e-1b21-416e-8082-24b46c4297d8", "address": "fa:16:3e:d8:c0:d3", "network": {"id": "ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062271977-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9daefd52b5d441f4aa7111891776971e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap779b5d5e-1b", "ovs_interfaceid": "779b5d5e-1b21-416e-8082-24b46c4297d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.820 187643 DEBUG nova.network.os_vif_util [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.820 187643 DEBUG os_vif [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.821 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.822 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap779b5d5e-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.885 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.887 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.889 187643 INFO os_vif [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=779b5d5e-1b21-416e-8082-24b46c4297d8,network=Network(ba01db54-ffea-4bcc-9cf0-e77b68ebf7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap779b5d5e-1b')
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.890 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.890 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.890 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.891 187643 DEBUG nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.891 187643 INFO nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Deleting instance files /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8_del
Feb 23 11:02:21 compute-0 nova_compute[187639]: 2026-02-23 11:02:21.892 187643 INFO nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Deletion of /var/lib/nova/instances/140b123c-947b-432d-ad6e-6ffa17bd6ac8_del complete
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.166 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.167 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.167 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.167 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.167 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.168 187643 WARNING nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received unexpected event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with vm_state active and task_state migrating.
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.168 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.168 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.168 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.168 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.169 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.169 187643 WARNING nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received unexpected event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with vm_state active and task_state migrating.
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.169 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.169 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.169 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.169 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.170 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.170 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-unplugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.170 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.170 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.170 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.170 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.171 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.171 187643 WARNING nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received unexpected event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with vm_state active and task_state migrating.
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.171 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.171 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.171 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.171 187643 DEBUG oslo_concurrency.lockutils [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.172 187643 DEBUG nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] No waiting events found dispatching network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:02:23 compute-0 nova_compute[187639]: 2026-02-23 11:02:23.172 187643 WARNING nova.compute.manager [req-1a07da0b-245a-4088-abd5-af3741b85c96 req-2bd80daa-6c08-4ee9-a9fb-4a8a50a9e6e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Received unexpected event network-vif-plugged-779b5d5e-1b21-416e-8082-24b46c4297d8 for instance with vm_state active and task_state migrating.
Feb 23 11:02:25 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 11:02:25 compute-0 systemd[210851]: Activating special unit Exit the Session...
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped target Main User Target.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped target Basic System.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped target Paths.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped target Sockets.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped target Timers.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 11:02:25 compute-0 systemd[210851]: Closed D-Bus User Message Bus Socket.
Feb 23 11:02:25 compute-0 systemd[210851]: Stopped Create User's Volatile Files and Directories.
Feb 23 11:02:25 compute-0 systemd[210851]: Removed slice User Application Slice.
Feb 23 11:02:25 compute-0 systemd[210851]: Reached target Shutdown.
Feb 23 11:02:25 compute-0 systemd[210851]: Finished Exit the Session.
Feb 23 11:02:25 compute-0 systemd[210851]: Reached target Exit the Session.
Feb 23 11:02:25 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 11:02:25 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 11:02:25 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 11:02:25 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 11:02:25 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 11:02:25 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 11:02:25 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 11:02:25 compute-0 nova_compute[187639]: 2026-02-23 11:02:25.777 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.886 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.941 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.941 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.941 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "140b123c-947b-432d-ad6e-6ffa17bd6ac8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.969 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.970 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.970 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:26 compute-0 nova_compute[187639]: 2026-02-23 11:02:26.971 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.125 187643 WARNING nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.127 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5796MB free_disk=73.20605087280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.127 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.127 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.174 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance 140b123c-947b-432d-ad6e-6ffa17bd6ac8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.199 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.246 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration 099359bc-28d9-4d4a-9da4-dcecb9bff51a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.246 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.247 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.295 187643 DEBUG nova.compute.provider_tree [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.311 187643 DEBUG nova.scheduler.client.report [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.333 187643 DEBUG nova.compute.resource_tracker [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.333 187643 DEBUG oslo_concurrency.lockutils [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.339 187643 INFO nova.compute.manager [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.434 187643 INFO nova.scheduler.client.report [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration 099359bc-28d9-4d4a-9da4-dcecb9bff51a
Feb 23 11:02:27 compute-0 nova_compute[187639]: 2026-02-23 11:02:27.434 187643 DEBUG nova.virt.libvirt.driver [None req-b9cec9c5-9f40-4b9b-9f85-9a3f2323333e a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 11:02:29 compute-0 podman[197002]: time="2026-02-23T11:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:02:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:02:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 23 11:02:30 compute-0 nova_compute[187639]: 2026-02-23 11:02:30.781 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:31 compute-0 openstack_network_exporter[199919]: ERROR   11:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:02:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:02:31 compute-0 openstack_network_exporter[199919]: ERROR   11:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:02:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:02:31 compute-0 nova_compute[187639]: 2026-02-23 11:02:31.928 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:32 compute-0 podman[211024]: 2026-02-23 11:02:32.866709167 +0000 UTC m=+0.062954329 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:02:33 compute-0 sshd-session[211048]: Invalid user admin from 143.198.30.3 port 43362
Feb 23 11:02:33 compute-0 sshd-session[211048]: Connection closed by invalid user admin 143.198.30.3 port 43362 [preauth]
Feb 23 11:02:35 compute-0 nova_compute[187639]: 2026-02-23 11:02:35.783 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:35 compute-0 nova_compute[187639]: 2026-02-23 11:02:35.919 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844540.917898, 140b123c-947b-432d-ad6e-6ffa17bd6ac8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:02:35 compute-0 nova_compute[187639]: 2026-02-23 11:02:35.919 187643 INFO nova.compute.manager [-] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] VM Stopped (Lifecycle Event)
Feb 23 11:02:35 compute-0 nova_compute[187639]: 2026-02-23 11:02:35.941 187643 DEBUG nova.compute.manager [None req-1dbe05ec-b9c9-4964-8563-82ccf5289f78 - - - - - -] [instance: 140b123c-947b-432d-ad6e-6ffa17bd6ac8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:02:36 compute-0 nova_compute[187639]: 2026-02-23 11:02:36.930 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:40 compute-0 nova_compute[187639]: 2026-02-23 11:02:40.785 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:41 compute-0 podman[211050]: 2026-02-23 11:02:41.857886101 +0000 UTC m=+0.058084326 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 11:02:41 compute-0 nova_compute[187639]: 2026-02-23 11:02:41.972 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:43 compute-0 sshd-session[211069]: Connection closed by authenticating user root 165.227.79.48 port 52080 [preauth]
Feb 23 11:02:45 compute-0 nova_compute[187639]: 2026-02-23 11:02:45.787 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:45 compute-0 podman[211071]: 2026-02-23 11:02:45.878335318 +0000 UTC m=+0.081031358 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:02:46 compute-0 nova_compute[187639]: 2026-02-23 11:02:46.974 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:50 compute-0 nova_compute[187639]: 2026-02-23 11:02:50.789 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:50 compute-0 podman[211098]: 2026-02-23 11:02:50.843189174 +0000 UTC m=+0.051088090 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 11:02:52 compute-0 nova_compute[187639]: 2026-02-23 11:02:52.020 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:52 compute-0 nova_compute[187639]: 2026-02-23 11:02:52.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:52 compute-0 nova_compute[187639]: 2026-02-23 11:02:52.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 11:02:52 compute-0 nova_compute[187639]: 2026-02-23 11:02:52.716 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 11:02:53 compute-0 nova_compute[187639]: 2026-02-23 11:02:53.714 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:54 compute-0 nova_compute[187639]: 2026-02-23 11:02:54.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:55 compute-0 nova_compute[187639]: 2026-02-23 11:02:55.792 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:56 compute-0 nova_compute[187639]: 2026-02-23 11:02:56.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.023 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.716 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.717 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:57 compute-0 nova_compute[187639]: 2026-02-23 11:02:57.717 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:02:58 compute-0 nova_compute[187639]: 2026-02-23 11:02:58.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:02:59 compute-0 podman[197002]: time="2026-02-23T11:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:02:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:02:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.742 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.742 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.742 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.742 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.791 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.895 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.896 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5812MB free_disk=73.20596694946289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.897 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:03:00 compute-0 nova_compute[187639]: 2026-02-23 11:03:00.897 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.011 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.011 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.073 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.098 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.101 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.101 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.102 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.102 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 11:03:01 compute-0 nova_compute[187639]: 2026-02-23 11:03:01.131 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:01 compute-0 openstack_network_exporter[199919]: ERROR   11:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:03:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:03:01 compute-0 openstack_network_exporter[199919]: ERROR   11:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:03:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:03:02 compute-0 nova_compute[187639]: 2026-02-23 11:03:02.057 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:02 compute-0 nova_compute[187639]: 2026-02-23 11:03:02.147 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:03 compute-0 podman[211119]: 2026-02-23 11:03:03.886486198 +0000 UTC m=+0.087843123 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:03:04 compute-0 sshd-session[211143]: Invalid user admin from 143.198.30.3 port 45440
Feb 23 11:03:04 compute-0 sshd-session[211143]: Connection closed by invalid user admin 143.198.30.3 port 45440 [preauth]
Feb 23 11:03:05 compute-0 nova_compute[187639]: 2026-02-23 11:03:05.826 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:07 compute-0 nova_compute[187639]: 2026-02-23 11:03:07.060 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:08 compute-0 ovn_controller[97601]: 2026-02-23T11:03:08Z|00078|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 23 11:03:10 compute-0 nova_compute[187639]: 2026-02-23 11:03:10.828 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:11 compute-0 nova_compute[187639]: 2026-02-23 11:03:11.821 187643 DEBUG nova.compute.manager [None req-5d94b975-9831-43ae-9ea2-84a66a76a325 d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 23 11:03:11 compute-0 nova_compute[187639]: 2026-02-23 11:03:11.890 187643 DEBUG nova.compute.provider_tree [None req-5d94b975-9831-43ae-9ea2-84a66a76a325 d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 14 to 17 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:03:12 compute-0 nova_compute[187639]: 2026-02-23 11:03:12.100 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:03:12.645 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:03:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:03:12.645 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:03:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:03:12.645 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:03:12 compute-0 podman[211145]: 2026-02-23 11:03:12.849092185 +0000 UTC m=+0.052618520 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 11:03:14 compute-0 nova_compute[187639]: 2026-02-23 11:03:14.899 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:03:14.899 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:03:14 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:03:14.901 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:03:15 compute-0 nova_compute[187639]: 2026-02-23 11:03:15.828 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:16 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:03:16.903 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:03:16 compute-0 podman[211165]: 2026-02-23 11:03:16.925346741 +0000 UTC m=+0.119023921 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 11:03:17 compute-0 nova_compute[187639]: 2026-02-23 11:03:17.101 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:19 compute-0 nova_compute[187639]: 2026-02-23 11:03:19.102 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:20 compute-0 nova_compute[187639]: 2026-02-23 11:03:20.830 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:21 compute-0 podman[211191]: 2026-02-23 11:03:21.842721587 +0000 UTC m=+0.050548827 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 11:03:22 compute-0 nova_compute[187639]: 2026-02-23 11:03:22.143 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:25 compute-0 nova_compute[187639]: 2026-02-23 11:03:25.833 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:27 compute-0 nova_compute[187639]: 2026-02-23 11:03:27.146 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:28 compute-0 sshd-session[211212]: Connection closed by authenticating user root 165.227.79.48 port 52808 [preauth]
Feb 23 11:03:29 compute-0 podman[197002]: time="2026-02-23T11:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:03:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:03:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 23 11:03:30 compute-0 nova_compute[187639]: 2026-02-23 11:03:30.879 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:31 compute-0 openstack_network_exporter[199919]: ERROR   11:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:03:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:03:31 compute-0 openstack_network_exporter[199919]: ERROR   11:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:03:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:03:32 compute-0 nova_compute[187639]: 2026-02-23 11:03:32.202 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:34 compute-0 podman[211214]: 2026-02-23 11:03:34.838617677 +0000 UTC m=+0.046283477 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:03:35 compute-0 nova_compute[187639]: 2026-02-23 11:03:35.881 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:36 compute-0 sshd-session[211238]: Invalid user admin from 143.198.30.3 port 46038
Feb 23 11:03:36 compute-0 sshd-session[211238]: Connection closed by invalid user admin 143.198.30.3 port 46038 [preauth]
Feb 23 11:03:37 compute-0 nova_compute[187639]: 2026-02-23 11:03:37.205 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:40 compute-0 nova_compute[187639]: 2026-02-23 11:03:40.882 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:42 compute-0 nova_compute[187639]: 2026-02-23 11:03:42.241 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:43 compute-0 podman[211240]: 2026-02-23 11:03:43.865393857 +0000 UTC m=+0.059511186 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:03:45 compute-0 nova_compute[187639]: 2026-02-23 11:03:45.920 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:47 compute-0 nova_compute[187639]: 2026-02-23 11:03:47.263 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:47 compute-0 podman[211259]: 2026-02-23 11:03:47.863336286 +0000 UTC m=+0.068819124 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, 
org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 11:03:50 compute-0 nova_compute[187639]: 2026-02-23 11:03:50.921 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:52 compute-0 nova_compute[187639]: 2026-02-23 11:03:52.266 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:52 compute-0 podman[211286]: 2026-02-23 11:03:52.844306443 +0000 UTC m=+0.049121910 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 23 11:03:55 compute-0 ovn_controller[97601]: 2026-02-23T11:03:55Z|00079|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 11:03:55 compute-0 nova_compute[187639]: 2026-02-23 11:03:55.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:55 compute-0 nova_compute[187639]: 2026-02-23 11:03:55.963 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:56 compute-0 nova_compute[187639]: 2026-02-23 11:03:56.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.269 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.712 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.713 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:57 compute-0 nova_compute[187639]: 2026-02-23 11:03:57.713 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:03:58 compute-0 nova_compute[187639]: 2026-02-23 11:03:58.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:58 compute-0 nova_compute[187639]: 2026-02-23 11:03:58.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:58 compute-0 nova_compute[187639]: 2026-02-23 11:03:58.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:03:59 compute-0 podman[197002]: time="2026-02-23T11:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:03:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:03:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.734 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.734 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.735 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.880 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.881 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5822MB free_disk=73.2059555053711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.881 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.882 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.965 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.966 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:04:00 compute-0 nova_compute[187639]: 2026-02-23 11:04:00.968 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:01 compute-0 nova_compute[187639]: 2026-02-23 11:04:01.115 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:04:01 compute-0 nova_compute[187639]: 2026-02-23 11:04:01.126 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:04:01 compute-0 nova_compute[187639]: 2026-02-23 11:04:01.127 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:04:01 compute-0 nova_compute[187639]: 2026-02-23 11:04:01.127 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:01 compute-0 openstack_network_exporter[199919]: ERROR   11:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:04:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:04:01 compute-0 openstack_network_exporter[199919]: ERROR   11:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:04:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:04:02 compute-0 nova_compute[187639]: 2026-02-23 11:04:02.271 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:04 compute-0 nova_compute[187639]: 2026-02-23 11:04:04.128 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:04 compute-0 nova_compute[187639]: 2026-02-23 11:04:04.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:05 compute-0 podman[211308]: 2026-02-23 11:04:05.840422939 +0000 UTC m=+0.048930535 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:04:06 compute-0 nova_compute[187639]: 2026-02-23 11:04:06.027 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:07 compute-0 nova_compute[187639]: 2026-02-23 11:04:07.313 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:07 compute-0 sshd-session[211333]: Invalid user admin from 143.198.30.3 port 53542
Feb 23 11:04:07 compute-0 sshd-session[211333]: Connection closed by invalid user admin 143.198.30.3 port 53542 [preauth]
Feb 23 11:04:11 compute-0 nova_compute[187639]: 2026-02-23 11:04:11.030 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:12 compute-0 nova_compute[187639]: 2026-02-23 11:04:12.317 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:12.646 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:12.648 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:12.648 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:14 compute-0 podman[211335]: 2026-02-23 11:04:14.832191715 +0000 UTC m=+0.040135249 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 11:04:15 compute-0 sshd-session[211355]: Connection closed by authenticating user root 165.227.79.48 port 59362 [preauth]
Feb 23 11:04:16 compute-0 nova_compute[187639]: 2026-02-23 11:04:16.070 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:17.258 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:04:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:17.259 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:04:17 compute-0 nova_compute[187639]: 2026-02-23 11:04:17.277 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:17 compute-0 nova_compute[187639]: 2026-02-23 11:04:17.318 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:18 compute-0 podman[211357]: 2026-02-23 11:04:18.870410544 +0000 UTC m=+0.075230239 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 11:04:21 compute-0 nova_compute[187639]: 2026-02-23 11:04:21.072 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:22 compute-0 nova_compute[187639]: 2026-02-23 11:04:22.320 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:23 compute-0 podman[211383]: 2026-02-23 11:04:23.844926945 +0000 UTC m=+0.052866036 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vcs-type=git, release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Feb 23 11:04:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:24.262 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:26 compute-0 nova_compute[187639]: 2026-02-23 11:04:26.074 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:27 compute-0 nova_compute[187639]: 2026-02-23 11:04:27.324 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:29 compute-0 podman[197002]: time="2026-02-23T11:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:04:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:04:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Feb 23 11:04:31 compute-0 nova_compute[187639]: 2026-02-23 11:04:31.075 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:31 compute-0 openstack_network_exporter[199919]: ERROR   11:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:04:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:04:31 compute-0 openstack_network_exporter[199919]: ERROR   11:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:04:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:04:32 compute-0 nova_compute[187639]: 2026-02-23 11:04:32.326 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:36 compute-0 nova_compute[187639]: 2026-02-23 11:04:36.078 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:36 compute-0 podman[211405]: 2026-02-23 11:04:36.841631577 +0000 UTC m=+0.043319061 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:04:37 compute-0 nova_compute[187639]: 2026-02-23 11:04:37.329 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:38 compute-0 sshd-session[211429]: Invalid user admin from 143.198.30.3 port 42120
Feb 23 11:04:38 compute-0 sshd-session[211429]: Connection closed by invalid user admin 143.198.30.3 port 42120 [preauth]
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.257 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.258 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.273 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.353 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.354 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.361 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.361 187643 INFO nova.compute.claims [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.508 187643 DEBUG nova.compute.provider_tree [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.522 187643 DEBUG nova.scheduler.client.report [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.549 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.550 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.610 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.611 187643 DEBUG nova.network.neutron [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.650 187643 INFO nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.673 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.812 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.813 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.814 187643 INFO nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Creating image(s)
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.814 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.815 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.816 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.837 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.885 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.886 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.887 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.904 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.927 187643 DEBUG nova.policy [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.980 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:04:40 compute-0 nova_compute[187639]: 2026-02-23 11:04:40.980 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.008 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.009 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.009 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.065 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.066 187643 DEBUG nova.virt.disk.api [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.066 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.079 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.126 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.126 187643 DEBUG nova.virt.disk.api [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.127 187643 DEBUG nova.objects.instance [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid d977d6e0-416b-4f8f-a035-224ae3b856f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.145 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.145 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Ensure instance console log exists: /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.146 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.146 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.146 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:41 compute-0 nova_compute[187639]: 2026-02-23 11:04:41.471 187643 DEBUG nova.network.neutron [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Successfully created port: 6f6eec7d-316d-4eca-bab2-007acd8bc545 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.080 187643 DEBUG nova.network.neutron [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Successfully updated port: 6f6eec7d-316d-4eca-bab2-007acd8bc545 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.142 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.142 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.143 187643 DEBUG nova.network.neutron [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.202 187643 DEBUG nova.compute.manager [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-changed-6f6eec7d-316d-4eca-bab2-007acd8bc545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.202 187643 DEBUG nova.compute.manager [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Refreshing instance network info cache due to event network-changed-6f6eec7d-316d-4eca-bab2-007acd8bc545. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.203 187643 DEBUG oslo_concurrency.lockutils [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.287 187643 DEBUG nova.network.neutron [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.373 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.906 187643 DEBUG nova.network.neutron [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Updating instance_info_cache with network_info: [{"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.923 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.923 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Instance network_info: |[{"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.924 187643 DEBUG oslo_concurrency.lockutils [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.924 187643 DEBUG nova.network.neutron [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Refreshing network info cache for port 6f6eec7d-316d-4eca-bab2-007acd8bc545 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.929 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Start _get_guest_xml network_info=[{"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.935 187643 WARNING nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.941 187643 DEBUG nova.virt.libvirt.host [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.941 187643 DEBUG nova.virt.libvirt.host [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.952 187643 DEBUG nova.virt.libvirt.host [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.953 187643 DEBUG nova.virt.libvirt.host [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.955 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.955 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.956 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.957 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.957 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.958 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.958 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.958 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.959 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.959 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.960 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.960 187643 DEBUG nova.virt.hardware [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.966 187643 DEBUG nova.virt.libvirt.vif [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1024966650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1024966650',id=12,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-euhuqnsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:04:40Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=d977d6e0-416b-4f8f-a035-224ae3b856f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.967 187643 DEBUG nova.network.os_vif_util [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.968 187643 DEBUG nova.network.os_vif_util [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.969 187643 DEBUG nova.objects.instance [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid d977d6e0-416b-4f8f-a035-224ae3b856f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.990 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <uuid>d977d6e0-416b-4f8f-a035-224ae3b856f8</uuid>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <name>instance-0000000c</name>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-1024966650</nova:name>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:04:42</nova:creationTime>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         <nova:port uuid="6f6eec7d-316d-4eca-bab2-007acd8bc545">
Feb 23 11:04:42 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <system>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <entry name="serial">d977d6e0-416b-4f8f-a035-224ae3b856f8</entry>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <entry name="uuid">d977d6e0-416b-4f8f-a035-224ae3b856f8</entry>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </system>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <os>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </os>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <features>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </features>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.config"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:c9:3c:12"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <target dev="tap6f6eec7d-31"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/console.log" append="off"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <video>
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </video>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:04:42 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:04:42 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:04:42 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:04:42 compute-0 nova_compute[187639]: </domain>
Feb 23 11:04:42 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.991 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Preparing to wait for external event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.991 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.991 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.991 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.992 187643 DEBUG nova.virt.libvirt.vif [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1024966650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1024966650',id=12,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-euhuqnsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:04:40Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=d977d6e0-416b-4f8f-a035-224ae3b856f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.992 187643 DEBUG nova.network.os_vif_util [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.993 187643 DEBUG nova.network.os_vif_util [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.993 187643 DEBUG os_vif [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.994 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.994 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.994 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.996 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.997 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f6eec7d-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.997 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6f6eec7d-31, col_values=(('external_ids', {'iface-id': '6f6eec7d-316d-4eca-bab2-007acd8bc545', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:3c:12', 'vm-uuid': 'd977d6e0-416b-4f8f-a035-224ae3b856f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:42 compute-0 nova_compute[187639]: 2026-02-23 11:04:42.999 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:43 compute-0 NetworkManager[57207]: <info>  [1771844683.0002] manager: (tap6f6eec7d-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.001 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.003 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.004 187643 INFO os_vif [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31')
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.069 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.070 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.070 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:c9:3c:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:04:43 compute-0 nova_compute[187639]: 2026-02-23 11:04:43.070 187643 INFO nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Using config drive
Feb 23 11:04:44 compute-0 nova_compute[187639]: 2026-02-23 11:04:44.929 187643 INFO nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Creating config drive at /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.config
Feb 23 11:04:44 compute-0 nova_compute[187639]: 2026-02-23 11:04:44.935 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp1x1t_r5_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.060 187643 DEBUG oslo_concurrency.processutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp1x1t_r5_" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:04:45 compute-0 kernel: tap6f6eec7d-31: entered promiscuous mode
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.128 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 NetworkManager[57207]: <info>  [1771844685.1307] manager: (tap6f6eec7d-31): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.132 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 ovn_controller[97601]: 2026-02-23T11:04:45Z|00080|binding|INFO|Claiming lport 6f6eec7d-316d-4eca-bab2-007acd8bc545 for this chassis.
Feb 23 11:04:45 compute-0 ovn_controller[97601]: 2026-02-23T11:04:45Z|00081|binding|INFO|6f6eec7d-316d-4eca-bab2-007acd8bc545: Claiming fa:16:3e:c9:3c:12 10.100.0.14
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.135 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.139 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.150 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:3c:12 10.100.0.14'], port_security=['fa:16:3e:c9:3c:12 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd977d6e0-416b-4f8f-a035-224ae3b856f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=6f6eec7d-316d-4eca-bab2-007acd8bc545) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.152 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 6f6eec7d-316d-4eca-bab2-007acd8bc545 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.155 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:04:45 compute-0 systemd-machined[156970]: New machine qemu-7-instance-0000000c.
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.168 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e2094610-72ba-4929-85f7-fe8a9073e061]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.169 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.171 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.171 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[da3c1abd-3ff6-46b8-8a39-7b0481954783]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.173 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf8dbde-ca21-4ce4-bfb2-2385207f6d17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000c.
Feb 23 11:04:45 compute-0 ovn_controller[97601]: 2026-02-23T11:04:45Z|00082|binding|INFO|Setting lport 6f6eec7d-316d-4eca-bab2-007acd8bc545 ovn-installed in OVS
Feb 23 11:04:45 compute-0 ovn_controller[97601]: 2026-02-23T11:04:45Z|00083|binding|INFO|Setting lport 6f6eec7d-316d-4eca-bab2-007acd8bc545 up in Southbound
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.176 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 systemd-udevd[211487]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.187 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[42af3736-0948-4f58-a8ee-d973c1c17cbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 podman[211458]: 2026-02-23 11:04:45.194905534 +0000 UTC m=+0.066418413 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:04:45 compute-0 NetworkManager[57207]: <info>  [1771844685.1962] device (tap6f6eec7d-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:04:45 compute-0 NetworkManager[57207]: <info>  [1771844685.1972] device (tap6f6eec7d-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.199 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[328c88a9-a1e5-4710-8444-15970577df36]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.229 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[988793e7-b207-41cb-a123-cc3a863d9941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.233 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[792b762c-91c3-4f67-bbff-5f015eb337e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 NetworkManager[57207]: <info>  [1771844685.2345] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.263 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ec1397-db36-48e7-8797-1b0533a4621d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.266 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[72ef508a-6528-42ff-9749-d506668caa79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 NetworkManager[57207]: <info>  [1771844685.2835] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.286 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[5df504b3-602d-46ec-87d6-c38a337d6ad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.303 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2213d458-c165-4f74-bb3b-e1497f0878c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382945, 'reachable_time': 32492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211519, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.317 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d792c16d-4b84-4bd5-a6bd-9421f4dfb386]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382945, 'tstamp': 382945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211520, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.335 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[221200b4-cc41-4b8b-aeaa-276a26ecbd1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382945, 'reachable_time': 32492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211521, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.363 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1680d40d-0a28-4486-a3b3-998fcdf25e20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.409 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[c505cd6d-0d48-4ccd-ace4-5b03535bef47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.410 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.411 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.411 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.412 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:04:45 compute-0 NetworkManager[57207]: <info>  [1771844685.4146] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.414 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.415 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.416 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 ovn_controller[97601]: 2026-02-23T11:04:45Z|00084|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.417 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.419 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.419 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[eca556e0-d283-468f-b8b2-08c86c391b2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.421 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.421 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:04:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:04:45.421 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:04:45 compute-0 podman[211553]: 2026-02-23 11:04:45.710255814 +0000 UTC m=+0.040611582 container create 1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 11:04:45 compute-0 systemd[1]: Started libpod-conmon-1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513.scope.
Feb 23 11:04:45 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:04:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d759770d93fe9b7128bd1e1296552a87b4cea0b1d69289350bde6b6d48540ffc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:04:45 compute-0 podman[211553]: 2026-02-23 11:04:45.772770017 +0000 UTC m=+0.103125805 container init 1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 11:04:45 compute-0 podman[211553]: 2026-02-23 11:04:45.77720171 +0000 UTC m=+0.107557468 container start 1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:04:45 compute-0 podman[211553]: 2026-02-23 11:04:45.687723186 +0000 UTC m=+0.018078994 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:04:45 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [NOTICE]   (211578) : New worker (211580) forked
Feb 23 11:04:45 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [NOTICE]   (211578) : Loading success.
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.840 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844685.8392844, d977d6e0-416b-4f8f-a035-224ae3b856f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.841 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] VM Started (Lifecycle Event)
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.866 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.868 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844685.8399143, d977d6e0-416b-4f8f-a035-224ae3b856f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.868 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] VM Paused (Lifecycle Event)
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.884 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.887 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:04:45 compute-0 nova_compute[187639]: 2026-02-23 11:04:45.910 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.080 187643 DEBUG nova.compute.manager [req-0b5b5d2f-5a37-4327-b6da-84381fbc7ea2 req-b449e873-2dd4-4b6b-8a72-2766ccea0eec 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.081 187643 DEBUG oslo_concurrency.lockutils [req-0b5b5d2f-5a37-4327-b6da-84381fbc7ea2 req-b449e873-2dd4-4b6b-8a72-2766ccea0eec 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.081 187643 DEBUG oslo_concurrency.lockutils [req-0b5b5d2f-5a37-4327-b6da-84381fbc7ea2 req-b449e873-2dd4-4b6b-8a72-2766ccea0eec 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.082 187643 DEBUG oslo_concurrency.lockutils [req-0b5b5d2f-5a37-4327-b6da-84381fbc7ea2 req-b449e873-2dd4-4b6b-8a72-2766ccea0eec 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.083 187643 DEBUG nova.compute.manager [req-0b5b5d2f-5a37-4327-b6da-84381fbc7ea2 req-b449e873-2dd4-4b6b-8a72-2766ccea0eec 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Processing event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.084 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.088 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844686.088559, d977d6e0-416b-4f8f-a035-224ae3b856f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.089 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] VM Resumed (Lifecycle Event)
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.090 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.094 187643 INFO nova.virt.libvirt.driver [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Instance spawned successfully.
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.094 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.113 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.117 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.120 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.120 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.121 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.121 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.121 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.122 187643 DEBUG nova.virt.libvirt.driver [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.125 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.143 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.177 187643 INFO nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Took 5.37 seconds to spawn the instance on the hypervisor.
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.178 187643 DEBUG nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.233 187643 INFO nova.compute.manager [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Took 5.91 seconds to build instance.
Feb 23 11:04:46 compute-0 nova_compute[187639]: 2026-02-23 11:04:46.253 187643 DEBUG oslo_concurrency.lockutils [None req-3bee543b-c099-4525-b1a5-c83c8c336014 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:47 compute-0 nova_compute[187639]: 2026-02-23 11:04:47.170 187643 DEBUG nova.network.neutron [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Updated VIF entry in instance network info cache for port 6f6eec7d-316d-4eca-bab2-007acd8bc545. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:04:47 compute-0 nova_compute[187639]: 2026-02-23 11:04:47.171 187643 DEBUG nova.network.neutron [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Updating instance_info_cache with network_info: [{"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:04:47 compute-0 nova_compute[187639]: 2026-02-23 11:04:47.184 187643 DEBUG oslo_concurrency.lockutils [req-e687498a-afde-426c-89e6-8cebbbf805d2 req-466ba7bd-a5cf-41ed-996f-a1b37882da1e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.000 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.187 187643 DEBUG nova.compute.manager [req-342c4176-d587-4ef9-8a69-9f9ccb437ae0 req-aa4e0433-73e3-442e-842a-9879dcf42b3f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.187 187643 DEBUG oslo_concurrency.lockutils [req-342c4176-d587-4ef9-8a69-9f9ccb437ae0 req-aa4e0433-73e3-442e-842a-9879dcf42b3f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.187 187643 DEBUG oslo_concurrency.lockutils [req-342c4176-d587-4ef9-8a69-9f9ccb437ae0 req-aa4e0433-73e3-442e-842a-9879dcf42b3f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.187 187643 DEBUG oslo_concurrency.lockutils [req-342c4176-d587-4ef9-8a69-9f9ccb437ae0 req-aa4e0433-73e3-442e-842a-9879dcf42b3f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.188 187643 DEBUG nova.compute.manager [req-342c4176-d587-4ef9-8a69-9f9ccb437ae0 req-aa4e0433-73e3-442e-842a-9879dcf42b3f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] No waiting events found dispatching network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:04:48 compute-0 nova_compute[187639]: 2026-02-23 11:04:48.188 187643 WARNING nova.compute.manager [req-342c4176-d587-4ef9-8a69-9f9ccb437ae0 req-aa4e0433-73e3-442e-842a-9879dcf42b3f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received unexpected event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 for instance with vm_state active and task_state None.
Feb 23 11:04:49 compute-0 podman[211590]: 2026-02-23 11:04:49.950659068 +0000 UTC m=+0.152750487 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:04:51 compute-0 nova_compute[187639]: 2026-02-23 11:04:51.127 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:53 compute-0 nova_compute[187639]: 2026-02-23 11:04:53.004 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:54 compute-0 podman[211616]: 2026-02-23 11:04:54.842197869 +0000 UTC m=+0.044939241 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, version=9.7, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Feb 23 11:04:56 compute-0 nova_compute[187639]: 2026-02-23 11:04:56.151 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.899 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.899 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.900 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:04:57 compute-0 nova_compute[187639]: 2026-02-23 11:04:57.900 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid d977d6e0-416b-4f8f-a035-224ae3b856f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:04:58 compute-0 nova_compute[187639]: 2026-02-23 11:04:58.008 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:04:58 compute-0 ovn_controller[97601]: 2026-02-23T11:04:58Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:3c:12 10.100.0.14
Feb 23 11:04:58 compute-0 ovn_controller[97601]: 2026-02-23T11:04:58Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:3c:12 10.100.0.14
Feb 23 11:04:59 compute-0 podman[197002]: time="2026-02-23T11:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:04:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:04:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.894 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Updating instance_info_cache with network_info: [{"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.913 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-d977d6e0-416b-4f8f-a035-224ae3b856f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.913 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.913 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.914 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.914 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.914 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.914 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:04:59 compute-0 nova_compute[187639]: 2026-02-23 11:04:59.914 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:05:00 compute-0 sshd-session[211652]: Connection closed by authenticating user root 165.227.79.48 port 49340 [preauth]
Feb 23 11:05:00 compute-0 nova_compute[187639]: 2026-02-23 11:05:00.911 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:01 compute-0 nova_compute[187639]: 2026-02-23 11:05:01.201 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:01 compute-0 openstack_network_exporter[199919]: ERROR   11:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:05:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:05:01 compute-0 openstack_network_exporter[199919]: ERROR   11:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:05:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.719 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.720 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.720 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.720 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.818 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.894 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.895 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:02 compute-0 nova_compute[187639]: 2026-02-23 11:05:02.951 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.012 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.128 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.130 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5617MB free_disk=73.1772232055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.130 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.131 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.215 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance d977d6e0-416b-4f8f-a035-224ae3b856f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.215 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.216 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.254 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.281 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.358 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:05:03 compute-0 nova_compute[187639]: 2026-02-23 11:05:03.359 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:05 compute-0 nova_compute[187639]: 2026-02-23 11:05:05.360 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:06 compute-0 nova_compute[187639]: 2026-02-23 11:05:06.203 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:07 compute-0 podman[211661]: 2026-02-23 11:05:07.863702166 +0000 UTC m=+0.057818343 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:05:08 compute-0 nova_compute[187639]: 2026-02-23 11:05:08.015 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:09 compute-0 sshd-session[211686]: Invalid user admin from 143.198.30.3 port 47734
Feb 23 11:05:09 compute-0 sshd-session[211686]: Connection closed by invalid user admin 143.198.30.3 port 47734 [preauth]
Feb 23 11:05:11 compute-0 nova_compute[187639]: 2026-02-23 11:05:11.204 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:12.648 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:12.649 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:12.650 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:13 compute-0 nova_compute[187639]: 2026-02-23 11:05:13.016 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:15 compute-0 ovn_controller[97601]: 2026-02-23T11:05:15Z|00085|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 23 11:05:15 compute-0 podman[211688]: 2026-02-23 11:05:15.838715728 +0000 UTC m=+0.044820177 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 11:05:16 compute-0 nova_compute[187639]: 2026-02-23 11:05:16.234 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:18 compute-0 nova_compute[187639]: 2026-02-23 11:05:18.020 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:20 compute-0 nova_compute[187639]: 2026-02-23 11:05:20.436 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Creating tmpfile /var/lib/nova/instances/tmppngpv1un to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 23 11:05:20 compute-0 nova_compute[187639]: 2026-02-23 11:05:20.438 187643 DEBUG nova.compute.manager [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppngpv1un',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 23 11:05:20 compute-0 podman[211708]: 2026-02-23 11:05:20.885630202 +0000 UTC m=+0.086890313 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:05:21 compute-0 nova_compute[187639]: 2026-02-23 11:05:21.235 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:21 compute-0 nova_compute[187639]: 2026-02-23 11:05:21.427 187643 DEBUG nova.compute.manager [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppngpv1un',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a737b68c-9a83-45bf-b334-56899aef5ec8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 23 11:05:21 compute-0 nova_compute[187639]: 2026-02-23 11:05:21.457 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-a737b68c-9a83-45bf-b334-56899aef5ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:05:21 compute-0 nova_compute[187639]: 2026-02-23 11:05:21.457 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-a737b68c-9a83-45bf-b334-56899aef5ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:05:21 compute-0 nova_compute[187639]: 2026-02-23 11:05:21.458 187643 DEBUG nova.network.neutron [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.025 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.539 187643 DEBUG nova.network.neutron [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Updating instance_info_cache with network_info: [{"id": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "address": "fa:16:3e:05:72:06", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6521eeb6-49", "ovs_interfaceid": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.591 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-a737b68c-9a83-45bf-b334-56899aef5ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.593 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppngpv1un',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a737b68c-9a83-45bf-b334-56899aef5ec8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.594 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Creating instance directory: /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.594 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Creating disk.info with the contents: {'/var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk': 'qcow2', '/var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.595 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.595 187643 DEBUG nova.objects.instance [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'trusted_certs' on Instance uuid a737b68c-9a83-45bf-b334-56899aef5ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.692 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.748 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.749 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.749 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.759 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.832 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.832 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.865 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.867 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.867 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.910 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.912 187643 DEBUG nova.virt.disk.api [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Checking if we can resize image /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.912 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.985 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.985 187643 DEBUG nova.virt.disk.api [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Cannot resize image /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.985 187643 DEBUG nova.objects.instance [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid a737b68c-9a83-45bf-b334-56899aef5ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:05:23 compute-0 nova_compute[187639]: 2026-02-23 11:05:23.998 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.020 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.021 187643 DEBUG nova.virt.libvirt.volume.remotefs [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk.config to /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.021 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk.config /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.447 187643 DEBUG oslo_concurrency.processutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8/disk.config /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.449 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.453 187643 DEBUG nova.virt.libvirt.vif [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:04:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-247189889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-247189889',id=11,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:04:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-oo9hdlth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:04:33Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=a737b68c-9a83-45bf-b334-56899aef5ec8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "address": "fa:16:3e:05:72:06", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6521eeb6-49", "ovs_interfaceid": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.454 187643 DEBUG nova.network.os_vif_util [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "address": "fa:16:3e:05:72:06", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6521eeb6-49", "ovs_interfaceid": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.455 187643 DEBUG nova.network.os_vif_util [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:72:06,bridge_name='br-int',has_traffic_filtering=True,id=6521eeb6-496a-4be1-bff6-f203d8b6df69,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6521eeb6-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.456 187643 DEBUG os_vif [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:72:06,bridge_name='br-int',has_traffic_filtering=True,id=6521eeb6-496a-4be1-bff6-f203d8b6df69,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6521eeb6-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.457 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.458 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.459 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.462 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.463 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6521eeb6-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.464 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6521eeb6-49, col_values=(('external_ids', {'iface-id': '6521eeb6-496a-4be1-bff6-f203d8b6df69', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:72:06', 'vm-uuid': 'a737b68c-9a83-45bf-b334-56899aef5ec8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.498 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:24 compute-0 NetworkManager[57207]: <info>  [1771844724.4989] manager: (tap6521eeb6-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.502 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.504 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.506 187643 INFO os_vif [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:72:06,bridge_name='br-int',has_traffic_filtering=True,id=6521eeb6-496a-4be1-bff6-f203d8b6df69,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6521eeb6-49')
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.506 187643 DEBUG nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 23 11:05:24 compute-0 nova_compute[187639]: 2026-02-23 11:05:24.507 187643 DEBUG nova.compute.manager [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppngpv1un',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a737b68c-9a83-45bf-b334-56899aef5ec8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 23 11:05:25 compute-0 podman[211756]: 2026-02-23 11:05:25.866823696 +0000 UTC m=+0.072495702 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 23 11:05:26 compute-0 nova_compute[187639]: 2026-02-23 11:05:26.237 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:28 compute-0 nova_compute[187639]: 2026-02-23 11:05:28.100 187643 DEBUG nova.network.neutron [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Port 6521eeb6-496a-4be1-bff6-f203d8b6df69 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 23 11:05:28 compute-0 nova_compute[187639]: 2026-02-23 11:05:28.102 187643 DEBUG nova.compute.manager [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppngpv1un',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a737b68c-9a83-45bf-b334-56899aef5ec8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 23 11:05:28 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 23 11:05:28 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 23 11:05:28 compute-0 NetworkManager[57207]: <info>  [1771844728.3349] manager: (tap6521eeb6-49): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Feb 23 11:05:28 compute-0 kernel: tap6521eeb6-49: entered promiscuous mode
Feb 23 11:05:28 compute-0 ovn_controller[97601]: 2026-02-23T11:05:28Z|00086|binding|INFO|Claiming lport 6521eeb6-496a-4be1-bff6-f203d8b6df69 for this additional chassis.
Feb 23 11:05:28 compute-0 nova_compute[187639]: 2026-02-23 11:05:28.396 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:28 compute-0 ovn_controller[97601]: 2026-02-23T11:05:28Z|00087|binding|INFO|6521eeb6-496a-4be1-bff6-f203d8b6df69: Claiming fa:16:3e:05:72:06 10.100.0.11
Feb 23 11:05:28 compute-0 ovn_controller[97601]: 2026-02-23T11:05:28Z|00088|binding|INFO|Setting lport 6521eeb6-496a-4be1-bff6-f203d8b6df69 ovn-installed in OVS
Feb 23 11:05:28 compute-0 nova_compute[187639]: 2026-02-23 11:05:28.406 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:28 compute-0 systemd-machined[156970]: New machine qemu-8-instance-0000000b.
Feb 23 11:05:28 compute-0 systemd-udevd[211813]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:05:28 compute-0 NetworkManager[57207]: <info>  [1771844728.4313] device (tap6521eeb6-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:05:28 compute-0 NetworkManager[57207]: <info>  [1771844728.4322] device (tap6521eeb6-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:05:28 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Feb 23 11:05:29 compute-0 nova_compute[187639]: 2026-02-23 11:05:29.533 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:29 compute-0 nova_compute[187639]: 2026-02-23 11:05:29.743 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844729.7426474, a737b68c-9a83-45bf-b334-56899aef5ec8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:05:29 compute-0 nova_compute[187639]: 2026-02-23 11:05:29.743 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] VM Started (Lifecycle Event)
Feb 23 11:05:29 compute-0 podman[197002]: time="2026-02-23T11:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:05:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:05:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 23 11:05:29 compute-0 nova_compute[187639]: 2026-02-23 11:05:29.771 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:05:30 compute-0 nova_compute[187639]: 2026-02-23 11:05:30.518 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844730.518353, a737b68c-9a83-45bf-b334-56899aef5ec8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:05:30 compute-0 nova_compute[187639]: 2026-02-23 11:05:30.519 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] VM Resumed (Lifecycle Event)
Feb 23 11:05:30 compute-0 nova_compute[187639]: 2026-02-23 11:05:30.548 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:05:30 compute-0 nova_compute[187639]: 2026-02-23 11:05:30.550 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:05:30 compute-0 nova_compute[187639]: 2026-02-23 11:05:30.567 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 23 11:05:31 compute-0 nova_compute[187639]: 2026-02-23 11:05:31.239 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:31 compute-0 openstack_network_exporter[199919]: ERROR   11:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:05:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:05:31 compute-0 openstack_network_exporter[199919]: ERROR   11:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:05:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:05:31 compute-0 ovn_controller[97601]: 2026-02-23T11:05:31Z|00089|binding|INFO|Claiming lport 6521eeb6-496a-4be1-bff6-f203d8b6df69 for this chassis.
Feb 23 11:05:31 compute-0 ovn_controller[97601]: 2026-02-23T11:05:31Z|00090|binding|INFO|6521eeb6-496a-4be1-bff6-f203d8b6df69: Claiming fa:16:3e:05:72:06 10.100.0.11
Feb 23 11:05:31 compute-0 ovn_controller[97601]: 2026-02-23T11:05:31Z|00091|binding|INFO|Setting lport 6521eeb6-496a-4be1-bff6-f203d8b6df69 up in Southbound
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.534 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:72:06 10.100.0.11'], port_security=['fa:16:3e:05:72:06 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a737b68c-9a83-45bf-b334-56899aef5ec8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '11', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=6521eeb6-496a-4be1-bff6-f203d8b6df69) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.535 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 6521eeb6-496a-4be1-bff6-f203d8b6df69 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.537 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.552 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[fffe971e-abb2-4366-80b8-73f946394957]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:31 compute-0 nova_compute[187639]: 2026-02-23 11:05:31.555 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.556 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.574 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9e5071-ba63-4931-a86a-38187037fa63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.579 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[a279b753-1192-4e79-9c4c-635f0cd84dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.606 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[61394354-ec95-4fd4-a292-147f9779bcdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.624 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[16a4783b-64b2-4ca8-a556-e07463ecaa05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382945, 'reachable_time': 32492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211845, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.643 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[32016b63-0ff2-4eff-b0fc-20a8e668f7ce]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382955, 'tstamp': 382955}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211846, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382957, 'tstamp': 382957}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211846, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.646 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:31 compute-0 nova_compute[187639]: 2026-02-23 11:05:31.648 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.649 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.650 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.650 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.651 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:05:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:31.652 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:05:31 compute-0 nova_compute[187639]: 2026-02-23 11:05:31.842 187643 INFO nova.compute.manager [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Post operation of migration started
Feb 23 11:05:32 compute-0 nova_compute[187639]: 2026-02-23 11:05:32.393 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-a737b68c-9a83-45bf-b334-56899aef5ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:05:32 compute-0 nova_compute[187639]: 2026-02-23 11:05:32.394 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-a737b68c-9a83-45bf-b334-56899aef5ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:05:32 compute-0 nova_compute[187639]: 2026-02-23 11:05:32.394 187643 DEBUG nova.network.neutron [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.263 187643 DEBUG nova.network.neutron [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Updating instance_info_cache with network_info: [{"id": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "address": "fa:16:3e:05:72:06", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6521eeb6-49", "ovs_interfaceid": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.289 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-a737b68c-9a83-45bf-b334-56899aef5ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.312 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.312 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.313 187643 DEBUG oslo_concurrency.lockutils [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.319 187643 INFO nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 23 11:05:34 compute-0 virtqemud[186733]: Domain id=8 name='instance-0000000b' uuid=a737b68c-9a83-45bf-b334-56899aef5ec8 is tainted: custom-monitor
Feb 23 11:05:34 compute-0 nova_compute[187639]: 2026-02-23 11:05:34.579 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:35 compute-0 nova_compute[187639]: 2026-02-23 11:05:35.326 187643 INFO nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 23 11:05:36 compute-0 nova_compute[187639]: 2026-02-23 11:05:36.242 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:36 compute-0 nova_compute[187639]: 2026-02-23 11:05:36.332 187643 INFO nova.virt.libvirt.driver [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 23 11:05:36 compute-0 nova_compute[187639]: 2026-02-23 11:05:36.337 187643 DEBUG nova.compute.manager [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:05:36 compute-0 nova_compute[187639]: 2026-02-23 11:05:36.383 187643 DEBUG nova.objects.instance [None req-46b54b0d-2b99-4d1a-9470-9d731de822e7 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 23 11:05:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:36.654 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:38 compute-0 podman[211847]: 2026-02-23 11:05:38.851355611 +0000 UTC m=+0.050901439 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:05:39 compute-0 nova_compute[187639]: 2026-02-23 11:05:39.581 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:41 compute-0 sshd-session[211872]: Invalid user admin from 143.198.30.3 port 38962
Feb 23 11:05:41 compute-0 nova_compute[187639]: 2026-02-23 11:05:41.244 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:41 compute-0 sshd-session[211872]: Connection closed by invalid user admin 143.198.30.3 port 38962 [preauth]
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.105 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.105 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.106 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.106 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.107 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.108 187643 INFO nova.compute.manager [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Terminating instance
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.110 187643 DEBUG nova.compute.manager [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:05:42 compute-0 kernel: tap6f6eec7d-31 (unregistering): left promiscuous mode
Feb 23 11:05:42 compute-0 NetworkManager[57207]: <info>  [1771844742.1424] device (tap6f6eec7d-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.179 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 ovn_controller[97601]: 2026-02-23T11:05:42Z|00092|binding|INFO|Releasing lport 6f6eec7d-316d-4eca-bab2-007acd8bc545 from this chassis (sb_readonly=0)
Feb 23 11:05:42 compute-0 ovn_controller[97601]: 2026-02-23T11:05:42Z|00093|binding|INFO|Setting lport 6f6eec7d-316d-4eca-bab2-007acd8bc545 down in Southbound
Feb 23 11:05:42 compute-0 ovn_controller[97601]: 2026-02-23T11:05:42Z|00094|binding|INFO|Removing iface tap6f6eec7d-31 ovn-installed in OVS
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.182 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.184 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.196 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:3c:12 10.100.0.14'], port_security=['fa:16:3e:c9:3c:12 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd977d6e0-416b-4f8f-a035-224ae3b856f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=6f6eec7d-316d-4eca-bab2-007acd8bc545) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.199 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 6f6eec7d-316d-4eca-bab2-007acd8bc545 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.202 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:05:42 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Feb 23 11:05:42 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000c.scope: Consumed 14.156s CPU time.
Feb 23 11:05:42 compute-0 systemd-machined[156970]: Machine qemu-7-instance-0000000c terminated.
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.217 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e16921-1119-424f-85c7-47eec56baca0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.236 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[8703aadc-19f2-474a-be27-7a11376d528d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.238 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[c480b1d5-2785-4a80-99f5-f733a921b74f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.262 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[d7fe1e54-67b6-4dfa-bdf5-2e5fc6c046cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.278 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3863a90e-6eb5-4b01-a4a2-06d766c75228]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382945, 'reachable_time': 32492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211885, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.293 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[c44fc843-bbc6-45d0-aa4a-ed0e2e978593]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382955, 'tstamp': 382955}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211886, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382957, 'tstamp': 382957}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211886, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.295 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.297 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.300 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.301 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.302 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.303 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:42 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:42.304 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.325 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.330 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.357 187643 INFO nova.virt.libvirt.driver [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Instance destroyed successfully.
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.358 187643 DEBUG nova.objects.instance [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid d977d6e0-416b-4f8f-a035-224ae3b856f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.377 187643 DEBUG nova.virt.libvirt.vif [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1024966650',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1024966650',id=12,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-euhuqnsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:04:46Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=d977d6e0-416b-4f8f-a035-224ae3b856f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.378 187643 DEBUG nova.network.os_vif_util [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "address": "fa:16:3e:c9:3c:12", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6eec7d-31", "ovs_interfaceid": "6f6eec7d-316d-4eca-bab2-007acd8bc545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.378 187643 DEBUG nova.network.os_vif_util [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.379 187643 DEBUG os_vif [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.380 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.380 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f6eec7d-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.381 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.383 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.385 187643 INFO os_vif [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:3c:12,bridge_name='br-int',has_traffic_filtering=True,id=6f6eec7d-316d-4eca-bab2-007acd8bc545,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6eec7d-31')
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.386 187643 INFO nova.virt.libvirt.driver [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Deleting instance files /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8_del
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.387 187643 INFO nova.virt.libvirt.driver [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Deletion of /var/lib/nova/instances/d977d6e0-416b-4f8f-a035-224ae3b856f8_del complete
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.464 187643 INFO nova.compute.manager [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.465 187643 DEBUG oslo.service.loopingcall [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.465 187643 DEBUG nova.compute.manager [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:05:42 compute-0 nova_compute[187639]: 2026-02-23 11:05:42.466 187643 DEBUG nova.network.neutron [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:05:43 compute-0 nova_compute[187639]: 2026-02-23 11:05:43.115 187643 DEBUG nova.compute.manager [req-33891c8c-0eda-4252-9682-880deff1dac4 req-39dc710e-8b0b-4b61-ba7b-e9b0668efd14 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-vif-unplugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:05:43 compute-0 nova_compute[187639]: 2026-02-23 11:05:43.116 187643 DEBUG oslo_concurrency.lockutils [req-33891c8c-0eda-4252-9682-880deff1dac4 req-39dc710e-8b0b-4b61-ba7b-e9b0668efd14 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:43 compute-0 nova_compute[187639]: 2026-02-23 11:05:43.117 187643 DEBUG oslo_concurrency.lockutils [req-33891c8c-0eda-4252-9682-880deff1dac4 req-39dc710e-8b0b-4b61-ba7b-e9b0668efd14 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:43 compute-0 nova_compute[187639]: 2026-02-23 11:05:43.118 187643 DEBUG oslo_concurrency.lockutils [req-33891c8c-0eda-4252-9682-880deff1dac4 req-39dc710e-8b0b-4b61-ba7b-e9b0668efd14 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:43 compute-0 nova_compute[187639]: 2026-02-23 11:05:43.118 187643 DEBUG nova.compute.manager [req-33891c8c-0eda-4252-9682-880deff1dac4 req-39dc710e-8b0b-4b61-ba7b-e9b0668efd14 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] No waiting events found dispatching network-vif-unplugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:05:43 compute-0 nova_compute[187639]: 2026-02-23 11:05:43.119 187643 DEBUG nova.compute.manager [req-33891c8c-0eda-4252-9682-880deff1dac4 req-39dc710e-8b0b-4b61-ba7b-e9b0668efd14 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-vif-unplugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:05:44 compute-0 nova_compute[187639]: 2026-02-23 11:05:44.988 187643 DEBUG nova.network.neutron [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.015 187643 INFO nova.compute.manager [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Took 2.55 seconds to deallocate network for instance.
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.063 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.064 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.113 187643 DEBUG nova.compute.manager [req-a9b08e05-b6c3-4c2a-90b7-ffcbfecb823a req-482fc32d-8451-4fb4-b6ee-938c21ccf762 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-vif-deleted-6f6eec7d-316d-4eca-bab2-007acd8bc545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.164 187643 DEBUG nova.compute.provider_tree [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.185 187643 DEBUG nova.scheduler.client.report [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.196 187643 DEBUG nova.compute.manager [req-65270d3c-410a-4dc1-8beb-a394e778940b req-37d8d8ca-526d-4aaf-89e0-af4fa279fdc9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.197 187643 DEBUG oslo_concurrency.lockutils [req-65270d3c-410a-4dc1-8beb-a394e778940b req-37d8d8ca-526d-4aaf-89e0-af4fa279fdc9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.197 187643 DEBUG oslo_concurrency.lockutils [req-65270d3c-410a-4dc1-8beb-a394e778940b req-37d8d8ca-526d-4aaf-89e0-af4fa279fdc9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.198 187643 DEBUG oslo_concurrency.lockutils [req-65270d3c-410a-4dc1-8beb-a394e778940b req-37d8d8ca-526d-4aaf-89e0-af4fa279fdc9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.198 187643 DEBUG nova.compute.manager [req-65270d3c-410a-4dc1-8beb-a394e778940b req-37d8d8ca-526d-4aaf-89e0-af4fa279fdc9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] No waiting events found dispatching network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.198 187643 WARNING nova.compute.manager [req-65270d3c-410a-4dc1-8beb-a394e778940b req-37d8d8ca-526d-4aaf-89e0-af4fa279fdc9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Received unexpected event network-vif-plugged-6f6eec7d-316d-4eca-bab2-007acd8bc545 for instance with vm_state deleted and task_state None.
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.221 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.250 187643 INFO nova.scheduler.client.report [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance d977d6e0-416b-4f8f-a035-224ae3b856f8
Feb 23 11:05:45 compute-0 nova_compute[187639]: 2026-02-23 11:05:45.314 187643 DEBUG oslo_concurrency.lockutils [None req-13df06d0-e778-45f8-abf8-a25fe4f0600a 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "d977d6e0-416b-4f8f-a035-224ae3b856f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.286 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.659 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "a737b68c-9a83-45bf-b334-56899aef5ec8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.660 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.660 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.661 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.661 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.663 187643 INFO nova.compute.manager [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Terminating instance
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.664 187643 DEBUG nova.compute.manager [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:05:46 compute-0 kernel: tap6521eeb6-49 (unregistering): left promiscuous mode
Feb 23 11:05:46 compute-0 NetworkManager[57207]: <info>  [1771844746.6881] device (tap6521eeb6-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.690 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 ovn_controller[97601]: 2026-02-23T11:05:46Z|00095|binding|INFO|Releasing lport 6521eeb6-496a-4be1-bff6-f203d8b6df69 from this chassis (sb_readonly=0)
Feb 23 11:05:46 compute-0 ovn_controller[97601]: 2026-02-23T11:05:46Z|00096|binding|INFO|Setting lport 6521eeb6-496a-4be1-bff6-f203d8b6df69 down in Southbound
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.695 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 ovn_controller[97601]: 2026-02-23T11:05:46Z|00097|binding|INFO|Removing iface tap6521eeb6-49 ovn-installed in OVS
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.697 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.699 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.704 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:72:06 10.100.0.11'], port_security=['fa:16:3e:05:72:06 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a737b68c-9a83-45bf-b334-56899aef5ec8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '13', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=6521eeb6-496a-4be1-bff6-f203d8b6df69) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.706 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 6521eeb6-496a-4be1-bff6-f203d8b6df69 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.709 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.711 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4f39eb-f594-47b4-9a78-795ab794eea7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.712 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:05:46 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 23 11:05:46 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 2.324s CPU time.
Feb 23 11:05:46 compute-0 systemd-machined[156970]: Machine qemu-8-instance-0000000b terminated.
Feb 23 11:05:46 compute-0 podman[211906]: 2026-02-23 11:05:46.82471107 +0000 UTC m=+0.106209315 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 23 11:05:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [NOTICE]   (211578) : haproxy version is 2.8.14-c23fe91
Feb 23 11:05:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [NOTICE]   (211578) : path to executable is /usr/sbin/haproxy
Feb 23 11:05:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [WARNING]  (211578) : Exiting Master process...
Feb 23 11:05:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [ALERT]    (211578) : Current worker (211580) exited with code 143 (Terminated)
Feb 23 11:05:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[211568]: [WARNING]  (211578) : All workers exited. Exiting... (0)
Feb 23 11:05:46 compute-0 systemd[1]: libpod-1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513.scope: Deactivated successfully.
Feb 23 11:05:46 compute-0 podman[211943]: 2026-02-23 11:05:46.857198271 +0000 UTC m=+0.048176707 container died 1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.882 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513-userdata-shm.mount: Deactivated successfully.
Feb 23 11:05:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-d759770d93fe9b7128bd1e1296552a87b4cea0b1d69289350bde6b6d48540ffc-merged.mount: Deactivated successfully.
Feb 23 11:05:46 compute-0 sshd-session[211938]: Connection closed by authenticating user root 165.227.79.48 port 42542 [preauth]
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.889 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 podman[211943]: 2026-02-23 11:05:46.894384866 +0000 UTC m=+0.085363302 container cleanup 1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:05:46 compute-0 systemd[1]: libpod-conmon-1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513.scope: Deactivated successfully.
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.913 187643 INFO nova.virt.libvirt.driver [-] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Instance destroyed successfully.
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.914 187643 DEBUG nova.objects.instance [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid a737b68c-9a83-45bf-b334-56899aef5ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.930 187643 DEBUG nova.virt.libvirt.vif [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-23T11:04:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-247189889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-247189889',id=11,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:04:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-oo9hdlth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:05:36Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=a737b68c-9a83-45bf-b334-56899aef5ec8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "address": "fa:16:3e:05:72:06", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6521eeb6-49", "ovs_interfaceid": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.931 187643 DEBUG nova.network.os_vif_util [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "address": "fa:16:3e:05:72:06", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6521eeb6-49", "ovs_interfaceid": "6521eeb6-496a-4be1-bff6-f203d8b6df69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.932 187643 DEBUG nova.network.os_vif_util [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:72:06,bridge_name='br-int',has_traffic_filtering=True,id=6521eeb6-496a-4be1-bff6-f203d8b6df69,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6521eeb6-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.932 187643 DEBUG os_vif [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:72:06,bridge_name='br-int',has_traffic_filtering=True,id=6521eeb6-496a-4be1-bff6-f203d8b6df69,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6521eeb6-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.934 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.935 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6521eeb6-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.936 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.939 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.941 187643 INFO os_vif [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:72:06,bridge_name='br-int',has_traffic_filtering=True,id=6521eeb6-496a-4be1-bff6-f203d8b6df69,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6521eeb6-49')
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.941 187643 INFO nova.virt.libvirt.driver [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Deleting instance files /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8_del
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.942 187643 INFO nova.virt.libvirt.driver [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Deletion of /var/lib/nova/instances/a737b68c-9a83-45bf-b334-56899aef5ec8_del complete
Feb 23 11:05:46 compute-0 podman[211988]: 2026-02-23 11:05:46.944539765 +0000 UTC m=+0.034812473 container remove 1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.947 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b0cda5-a9e8-484a-9c68-453bf2cacd11]: (4, ('Mon Feb 23 11:05:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513)\n1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513\nMon Feb 23 11:05:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513)\n1086bc5289d837efe813d6d005eb7c8697dc823cb115264348585469837ff513\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.948 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[13894d99-5f85-4cf1-838d-46ebc651c3a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.949 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.950 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.954 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.956 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e4375f28-d1b5-4e12-9407-fdd9c1b1e4a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.981 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[08b06e21-2099-4141-b774-dae4ad19d3ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.982 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[63401717-4a78-420f-915f-595db32b4f9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.982 187643 INFO nova.compute.manager [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Took 0.32 seconds to destroy the instance on the hypervisor.
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.983 187643 DEBUG oslo.service.loopingcall [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.984 187643 DEBUG nova.compute.manager [-] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:05:46 compute-0 nova_compute[187639]: 2026-02-23 11:05:46.984 187643 DEBUG nova.network.neutron [-] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.991 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2081fec3-470e-41ca-bbe5-263f7d22f3a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382939, 'reachable_time': 40554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212006, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.995 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:05:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:05:46.995 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[79f8c6d2-d545-4fa0-99ed-c17dbadbefe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.306 187643 DEBUG nova.compute.manager [req-9e26ce69-577a-4107-a28e-675e78fb30fd req-c9d4ce20-43d0-454f-8276-2400debb29b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Received event network-vif-unplugged-6521eeb6-496a-4be1-bff6-f203d8b6df69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.307 187643 DEBUG oslo_concurrency.lockutils [req-9e26ce69-577a-4107-a28e-675e78fb30fd req-c9d4ce20-43d0-454f-8276-2400debb29b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.308 187643 DEBUG oslo_concurrency.lockutils [req-9e26ce69-577a-4107-a28e-675e78fb30fd req-c9d4ce20-43d0-454f-8276-2400debb29b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.308 187643 DEBUG oslo_concurrency.lockutils [req-9e26ce69-577a-4107-a28e-675e78fb30fd req-c9d4ce20-43d0-454f-8276-2400debb29b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.309 187643 DEBUG nova.compute.manager [req-9e26ce69-577a-4107-a28e-675e78fb30fd req-c9d4ce20-43d0-454f-8276-2400debb29b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] No waiting events found dispatching network-vif-unplugged-6521eeb6-496a-4be1-bff6-f203d8b6df69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.309 187643 DEBUG nova.compute.manager [req-9e26ce69-577a-4107-a28e-675e78fb30fd req-c9d4ce20-43d0-454f-8276-2400debb29b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Received event network-vif-unplugged-6521eeb6-496a-4be1-bff6-f203d8b6df69 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.634 187643 DEBUG nova.network.neutron [-] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.729 187643 INFO nova.compute.manager [-] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Took 0.75 seconds to deallocate network for instance.
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.888 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.889 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.895 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:47 compute-0 nova_compute[187639]: 2026-02-23 11:05:47.956 187643 INFO nova.scheduler.client.report [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance a737b68c-9a83-45bf-b334-56899aef5ec8
Feb 23 11:05:48 compute-0 nova_compute[187639]: 2026-02-23 11:05:48.064 187643 DEBUG oslo_concurrency.lockutils [None req-616adb75-27b3-4966-9585-ec2b2fe49048 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.396 187643 DEBUG nova.compute.manager [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Received event network-vif-plugged-6521eeb6-496a-4be1-bff6-f203d8b6df69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.396 187643 DEBUG oslo_concurrency.lockutils [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.397 187643 DEBUG oslo_concurrency.lockutils [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.397 187643 DEBUG oslo_concurrency.lockutils [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a737b68c-9a83-45bf-b334-56899aef5ec8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.397 187643 DEBUG nova.compute.manager [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] No waiting events found dispatching network-vif-plugged-6521eeb6-496a-4be1-bff6-f203d8b6df69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.397 187643 WARNING nova.compute.manager [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Received unexpected event network-vif-plugged-6521eeb6-496a-4be1-bff6-f203d8b6df69 for instance with vm_state deleted and task_state None.
Feb 23 11:05:49 compute-0 nova_compute[187639]: 2026-02-23 11:05:49.397 187643 DEBUG nova.compute.manager [req-9220d128-49ea-4422-a274-918989f1bcb7 req-d01bc0be-fd45-438a-8ba7-14c9cf4b51b4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Received event network-vif-deleted-6521eeb6-496a-4be1-bff6-f203d8b6df69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:05:51 compute-0 nova_compute[187639]: 2026-02-23 11:05:51.290 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:51 compute-0 podman[212009]: 2026-02-23 11:05:51.866535139 +0000 UTC m=+0.070815768 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 11:05:51 compute-0 nova_compute[187639]: 2026-02-23 11:05:51.936 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:52 compute-0 sshd-session[212007]: Connection closed by authenticating user root 80.94.95.115 port 50286 [preauth]
Feb 23 11:05:56 compute-0 nova_compute[187639]: 2026-02-23 11:05:56.293 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:56 compute-0 podman[212036]: 2026-02-23 11:05:56.85525333 +0000 UTC m=+0.052603155 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 11:05:56 compute-0 nova_compute[187639]: 2026-02-23 11:05:56.937 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:05:57 compute-0 nova_compute[187639]: 2026-02-23 11:05:57.357 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844742.3560174, d977d6e0-416b-4f8f-a035-224ae3b856f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:05:57 compute-0 nova_compute[187639]: 2026-02-23 11:05:57.357 187643 INFO nova.compute.manager [-] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] VM Stopped (Lifecycle Event)
Feb 23 11:05:57 compute-0 nova_compute[187639]: 2026-02-23 11:05:57.378 187643 DEBUG nova.compute.manager [None req-3b7a3992-f1af-41e9-b99d-2bb0095b947c - - - - - -] [instance: d977d6e0-416b-4f8f-a035-224ae3b856f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:05:57 compute-0 nova_compute[187639]: 2026-02-23 11:05:57.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:57 compute-0 nova_compute[187639]: 2026-02-23 11:05:57.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:05:58 compute-0 nova_compute[187639]: 2026-02-23 11:05:58.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:59 compute-0 nova_compute[187639]: 2026-02-23 11:05:59.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:59 compute-0 nova_compute[187639]: 2026-02-23 11:05:59.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:05:59 compute-0 nova_compute[187639]: 2026-02-23 11:05:59.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:05:59 compute-0 nova_compute[187639]: 2026-02-23 11:05:59.708 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:05:59 compute-0 nova_compute[187639]: 2026-02-23 11:05:59.709 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:05:59 compute-0 podman[197002]: time="2026-02-23T11:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:05:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:05:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 23 11:06:00 compute-0 nova_compute[187639]: 2026-02-23 11:06:00.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:00 compute-0 nova_compute[187639]: 2026-02-23 11:06:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:01 compute-0 nova_compute[187639]: 2026-02-23 11:06:01.294 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:01 compute-0 openstack_network_exporter[199919]: ERROR   11:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:06:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:06:01 compute-0 openstack_network_exporter[199919]: ERROR   11:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:06:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:06:01 compute-0 nova_compute[187639]: 2026-02-23 11:06:01.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:01 compute-0 nova_compute[187639]: 2026-02-23 11:06:01.911 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844746.9103065, a737b68c-9a83-45bf-b334-56899aef5ec8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:06:01 compute-0 nova_compute[187639]: 2026-02-23 11:06:01.912 187643 INFO nova.compute.manager [-] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] VM Stopped (Lifecycle Event)
Feb 23 11:06:01 compute-0 nova_compute[187639]: 2026-02-23 11:06:01.934 187643 DEBUG nova.compute.manager [None req-3e62b3b2-46c9-4870-b0f0-51f97a2a850e - - - - - -] [instance: a737b68c-9a83-45bf-b334-56899aef5ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:06:01 compute-0 nova_compute[187639]: 2026-02-23 11:06:01.939 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.716 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.717 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.717 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.717 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.883 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.885 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5827MB free_disk=73.2059555053711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.885 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.885 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.997 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:06:04 compute-0 nova_compute[187639]: 2026-02-23 11:06:04.998 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.022 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.048 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.048 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.062 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.086 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.110 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.125 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.146 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:06:05 compute-0 nova_compute[187639]: 2026-02-23 11:06:05.146 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:06 compute-0 nova_compute[187639]: 2026-02-23 11:06:06.295 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:06 compute-0 nova_compute[187639]: 2026-02-23 11:06:06.941 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:07 compute-0 nova_compute[187639]: 2026-02-23 11:06:07.147 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:09 compute-0 nova_compute[187639]: 2026-02-23 11:06:09.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:09 compute-0 podman[212059]: 2026-02-23 11:06:09.847994713 +0000 UTC m=+0.054368362 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:06:11 compute-0 nova_compute[187639]: 2026-02-23 11:06:11.336 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:11 compute-0 nova_compute[187639]: 2026-02-23 11:06:11.943 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:12.649 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:12.650 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:12.650 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:14 compute-0 sshd-session[212083]: Invalid user admin from 143.198.30.3 port 43102
Feb 23 11:06:14 compute-0 sshd-session[212083]: Connection closed by invalid user admin 143.198.30.3 port 43102 [preauth]
Feb 23 11:06:16 compute-0 nova_compute[187639]: 2026-02-23 11:06:16.387 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:16 compute-0 nova_compute[187639]: 2026-02-23 11:06:16.945 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:17 compute-0 ovn_controller[97601]: 2026-02-23T11:06:17Z|00098|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 23 11:06:17 compute-0 podman[212085]: 2026-02-23 11:06:17.840383465 +0000 UTC m=+0.043587646 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 11:06:21 compute-0 nova_compute[187639]: 2026-02-23 11:06:21.427 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:21 compute-0 nova_compute[187639]: 2026-02-23 11:06:21.946 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:22 compute-0 podman[212102]: 2026-02-23 11:06:22.867910045 +0000 UTC m=+0.070773816 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:06:26 compute-0 nova_compute[187639]: 2026-02-23 11:06:26.462 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:26 compute-0 nova_compute[187639]: 2026-02-23 11:06:26.948 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:27 compute-0 podman[212128]: 2026-02-23 11:06:27.865356869 +0000 UTC m=+0.067073998 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, release=1770267347, vcs-type=git, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 11:06:29 compute-0 podman[197002]: time="2026-02-23T11:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:06:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:06:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 23 11:06:31 compute-0 openstack_network_exporter[199919]: ERROR   11:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:06:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:06:31 compute-0 openstack_network_exporter[199919]: ERROR   11:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:06:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:06:31 compute-0 nova_compute[187639]: 2026-02-23 11:06:31.502 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:31 compute-0 nova_compute[187639]: 2026-02-23 11:06:31.949 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:35 compute-0 sshd-session[212149]: Connection closed by authenticating user root 165.227.79.48 port 58082 [preauth]
Feb 23 11:06:36 compute-0 nova_compute[187639]: 2026-02-23 11:06:36.546 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:36 compute-0 nova_compute[187639]: 2026-02-23 11:06:36.950 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.571 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.572 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.605 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.726 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.727 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.737 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.737 187643 INFO nova.compute.claims [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.859 187643 DEBUG nova.compute.provider_tree [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.878 187643 DEBUG nova.scheduler.client.report [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.912 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.913 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.973 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:06:38 compute-0 nova_compute[187639]: 2026-02-23 11:06:38.973 187643 DEBUG nova.network.neutron [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.001 187643 INFO nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.017 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.126 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.127 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.128 187643 INFO nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Creating image(s)
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.129 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.130 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.131 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.154 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.230 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.232 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.233 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.254 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.324 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.326 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.354 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.355 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.355 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.403 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.404 187643 DEBUG nova.virt.disk.api [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.405 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.447 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.449 187643 DEBUG nova.virt.disk.api [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.449 187643 DEBUG nova.objects.instance [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid cf40a379-a79f-48d0-9b8d-15588edbccbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.471 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.471 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Ensure instance console log exists: /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.472 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.473 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:39 compute-0 nova_compute[187639]: 2026-02-23 11:06:39.473 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:40 compute-0 nova_compute[187639]: 2026-02-23 11:06:40.093 187643 DEBUG nova.policy [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:06:40 compute-0 podman[212166]: 2026-02-23 11:06:40.89004525 +0000 UTC m=+0.091826074 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:06:41 compute-0 nova_compute[187639]: 2026-02-23 11:06:41.606 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:41 compute-0 nova_compute[187639]: 2026-02-23 11:06:41.620 187643 DEBUG nova.network.neutron [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Successfully created port: 3dc893be-d39b-4ea7-baae-7b504533898e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:06:41 compute-0 nova_compute[187639]: 2026-02-23 11:06:41.952 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.331 187643 DEBUG nova.network.neutron [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Successfully updated port: 3dc893be-d39b-4ea7-baae-7b504533898e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.353 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.354 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.354 187643 DEBUG nova.network.neutron [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.456 187643 DEBUG nova.compute.manager [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-changed-3dc893be-d39b-4ea7-baae-7b504533898e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.456 187643 DEBUG nova.compute.manager [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Refreshing instance network info cache due to event network-changed-3dc893be-d39b-4ea7-baae-7b504533898e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:06:42 compute-0 nova_compute[187639]: 2026-02-23 11:06:42.457 187643 DEBUG oslo_concurrency.lockutils [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:06:43 compute-0 nova_compute[187639]: 2026-02-23 11:06:43.120 187643 DEBUG nova.network.neutron [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.405 187643 DEBUG nova.network.neutron [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Updating instance_info_cache with network_info: [{"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.437 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.438 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Instance network_info: |[{"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.439 187643 DEBUG oslo_concurrency.lockutils [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.439 187643 DEBUG nova.network.neutron [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Refreshing network info cache for port 3dc893be-d39b-4ea7-baae-7b504533898e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.446 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Start _get_guest_xml network_info=[{"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.452 187643 WARNING nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.460 187643 DEBUG nova.virt.libvirt.host [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.461 187643 DEBUG nova.virt.libvirt.host [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.464 187643 DEBUG nova.virt.libvirt.host [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.466 187643 DEBUG nova.virt.libvirt.host [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.468 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.468 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.469 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.469 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.470 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.470 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.471 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.471 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.472 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.473 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.473 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.473 187643 DEBUG nova.virt.hardware [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.479 187643 DEBUG nova.virt.libvirt.vif [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:06:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1484974806',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1484974806',id=14,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-wbslk2ev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:06:39Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=cf40a379-a79f-48d0-9b8d-15588edbccbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.480 187643 DEBUG nova.network.os_vif_util [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.481 187643 DEBUG nova.network.os_vif_util [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.482 187643 DEBUG nova.objects.instance [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid cf40a379-a79f-48d0-9b8d-15588edbccbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.512 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <uuid>cf40a379-a79f-48d0-9b8d-15588edbccbe</uuid>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <name>instance-0000000e</name>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-1484974806</nova:name>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:06:44</nova:creationTime>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         <nova:port uuid="3dc893be-d39b-4ea7-baae-7b504533898e">
Feb 23 11:06:44 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <system>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <entry name="serial">cf40a379-a79f-48d0-9b8d-15588edbccbe</entry>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <entry name="uuid">cf40a379-a79f-48d0-9b8d-15588edbccbe</entry>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </system>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <os>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </os>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <features>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </features>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.config"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:17:11:8a"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <target dev="tap3dc893be-d3"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/console.log" append="off"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <video>
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </video>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:06:44 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:06:44 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:06:44 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:06:44 compute-0 nova_compute[187639]: </domain>
Feb 23 11:06:44 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.514 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Preparing to wait for external event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.514 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.515 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.515 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.516 187643 DEBUG nova.virt.libvirt.vif [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:06:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1484974806',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1484974806',id=14,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-wbslk2ev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:06:39Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=cf40a379-a79f-48d0-9b8d-15588edbccbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.517 187643 DEBUG nova.network.os_vif_util [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.518 187643 DEBUG nova.network.os_vif_util [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.519 187643 DEBUG os_vif [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.519 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.520 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.521 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.525 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.526 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3dc893be-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.527 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3dc893be-d3, col_values=(('external_ids', {'iface-id': '3dc893be-d39b-4ea7-baae-7b504533898e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:11:8a', 'vm-uuid': 'cf40a379-a79f-48d0-9b8d-15588edbccbe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.529 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:44 compute-0 NetworkManager[57207]: <info>  [1771844804.5307] manager: (tap3dc893be-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.532 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.537 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.538 187643 INFO os_vif [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3')
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.609 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.609 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.610 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:17:11:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:06:44 compute-0 nova_compute[187639]: 2026-02-23 11:06:44.611 187643 INFO nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Using config drive
Feb 23 11:06:45 compute-0 sshd-session[212195]: Invalid user admin from 143.198.30.3 port 59244
Feb 23 11:06:45 compute-0 sshd-session[212195]: Connection closed by invalid user admin 143.198.30.3 port 59244 [preauth]
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.428 187643 INFO nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Creating config drive at /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.config
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.433 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpp4khgl8m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.550 187643 DEBUG oslo_concurrency.processutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpp4khgl8m" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:06:45 compute-0 kernel: tap3dc893be-d3: entered promiscuous mode
Feb 23 11:06:45 compute-0 NetworkManager[57207]: <info>  [1771844805.6097] manager: (tap3dc893be-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.611 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 ovn_controller[97601]: 2026-02-23T11:06:45Z|00099|binding|INFO|Claiming lport 3dc893be-d39b-4ea7-baae-7b504533898e for this chassis.
Feb 23 11:06:45 compute-0 ovn_controller[97601]: 2026-02-23T11:06:45Z|00100|binding|INFO|3dc893be-d39b-4ea7-baae-7b504533898e: Claiming fa:16:3e:17:11:8a 10.100.0.8
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.633 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:11:8a 10.100.0.8'], port_security=['fa:16:3e:17:11:8a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'cf40a379-a79f-48d0-9b8d-15588edbccbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=3dc893be-d39b-4ea7-baae-7b504533898e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:06:45 compute-0 ovn_controller[97601]: 2026-02-23T11:06:45Z|00101|binding|INFO|Setting lport 3dc893be-d39b-4ea7-baae-7b504533898e ovn-installed in OVS
Feb 23 11:06:45 compute-0 ovn_controller[97601]: 2026-02-23T11:06:45Z|00102|binding|INFO|Setting lport 3dc893be-d39b-4ea7-baae-7b504533898e up in Southbound
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.637 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 3dc893be-d39b-4ea7-baae-7b504533898e in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.637 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.641 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.651 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7f59f5-305f-4379-80b6-2256e181b832]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.652 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:06:45 compute-0 systemd-udevd[212215]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.653 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.653 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[541de700-80e6-4185-90cb-ec51a4ae7b31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.655 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[79632c20-bbd0-44fe-bd25-f0dd417240b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 systemd-machined[156970]: New machine qemu-9-instance-0000000e.
Feb 23 11:06:45 compute-0 NetworkManager[57207]: <info>  [1771844805.6631] device (tap3dc893be-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:06:45 compute-0 NetworkManager[57207]: <info>  [1771844805.6637] device (tap3dc893be-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.663 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[b17b72b6-adda-49f0-abed-750cd1bb5bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000e.
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.676 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[4a42f420-2e66-45a3-9be2-79ab6a851307]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.700 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b27e6d31-e2aa-442c-9b82-ad62a4ba68c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 NetworkManager[57207]: <info>  [1771844805.7059] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.705 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[007a6301-1cc5-4c06-929f-f2612ea86743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 systemd-udevd[212219]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.726 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[ec264a54-dadc-417b-99b1-ba297404c451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.729 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[8b25dc4d-83b7-409d-ae35-86c634639400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 NetworkManager[57207]: <info>  [1771844805.7477] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.751 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4a9fc6-d141-4bff-aa57-954cb22f0899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.764 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[576ff582-2c72-4aca-9cf7-38ee98fb689b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394991, 'reachable_time': 37377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212248, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.776 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[829433be-64c8-407b-b38a-62c786a4bc91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 394991, 'tstamp': 394991}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212249, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.791 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b4753076-6b38-4957-994a-7cdd1bdf10f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394991, 'reachable_time': 37377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212250, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.812 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[822a6c2e-ea63-4cf9-b47e-ac769f3f7268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.856 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[348f69b5-3c06-4083-bbc3-4e631d261496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.857 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.857 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.858 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.859 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 NetworkManager[57207]: <info>  [1771844805.8603] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Feb 23 11:06:45 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.862 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.863 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.864 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 ovn_controller[97601]: 2026-02-23T11:06:45Z|00103|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.872 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 nova_compute[187639]: 2026-02-23 11:06:45.873 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.873 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.874 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[77f0e645-5d33-450d-adc4-ac253b11e05b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.875 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:06:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:45.877 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.113 187643 DEBUG nova.compute.manager [req-3c41c516-963d-4ad0-abea-aca2132388b0 req-92563710-2c64-4ca3-ada8-4ce1d7ae61e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.114 187643 DEBUG oslo_concurrency.lockutils [req-3c41c516-963d-4ad0-abea-aca2132388b0 req-92563710-2c64-4ca3-ada8-4ce1d7ae61e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.115 187643 DEBUG oslo_concurrency.lockutils [req-3c41c516-963d-4ad0-abea-aca2132388b0 req-92563710-2c64-4ca3-ada8-4ce1d7ae61e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.115 187643 DEBUG oslo_concurrency.lockutils [req-3c41c516-963d-4ad0-abea-aca2132388b0 req-92563710-2c64-4ca3-ada8-4ce1d7ae61e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.116 187643 DEBUG nova.compute.manager [req-3c41c516-963d-4ad0-abea-aca2132388b0 req-92563710-2c64-4ca3-ada8-4ce1d7ae61e5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Processing event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:06:46 compute-0 podman[212280]: 2026-02-23 11:06:46.222804137 +0000 UTC m=+0.051560087 container create c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.249 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:46.250 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:06:46 compute-0 systemd[1]: Started libpod-conmon-c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4.scope.
Feb 23 11:06:46 compute-0 podman[212280]: 2026-02-23 11:06:46.194641521 +0000 UTC m=+0.023397531 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:06:46 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:06:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b9397ecf10d89e011e9f0f2a95beb035a6bc6fcdf7edba9161010f7db7901ed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:06:46 compute-0 podman[212280]: 2026-02-23 11:06:46.308745474 +0000 UTC m=+0.137501494 container init c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:06:46 compute-0 podman[212280]: 2026-02-23 11:06:46.312662228 +0000 UTC m=+0.141418218 container start c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 11:06:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [NOTICE]   (212305) : New worker (212307) forked
Feb 23 11:06:46 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [NOTICE]   (212305) : Loading success.
Feb 23 11:06:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:46.371 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.381 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844806.381182, cf40a379-a79f-48d0-9b8d-15588edbccbe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.382 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] VM Started (Lifecycle Event)
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.383 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.387 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.390 187643 INFO nova.virt.libvirt.driver [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Instance spawned successfully.
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.390 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.409 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.414 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.417 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.417 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.417 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.418 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.418 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.418 187643 DEBUG nova.virt.libvirt.driver [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.451 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.452 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844806.3813617, cf40a379-a79f-48d0-9b8d-15588edbccbe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.452 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] VM Paused (Lifecycle Event)
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.476 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.481 187643 INFO nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Took 7.35 seconds to spawn the instance on the hypervisor.
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.481 187643 DEBUG nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.483 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844806.3861148, cf40a379-a79f-48d0-9b8d-15588edbccbe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.483 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] VM Resumed (Lifecycle Event)
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.509 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.511 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.539 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.549 187643 INFO nova.compute.manager [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Took 7.86 seconds to build instance.
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.567 187643 DEBUG oslo_concurrency.lockutils [None req-7b68cd22-2672-4098-9a0b-fadfc6936e38 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.588 187643 DEBUG nova.network.neutron [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Updated VIF entry in instance network info cache for port 3dc893be-d39b-4ea7-baae-7b504533898e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.588 187643 DEBUG nova.network.neutron [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Updating instance_info_cache with network_info: [{"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.604 187643 DEBUG oslo_concurrency.lockutils [req-6c89f20c-02e6-473f-b80b-daf0350d5038 req-8ef00860-e57b-45fb-8de6-8618c42d649a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:06:46 compute-0 nova_compute[187639]: 2026-02-23 11:06:46.607 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:48 compute-0 nova_compute[187639]: 2026-02-23 11:06:48.227 187643 DEBUG nova.compute.manager [req-651c9229-b1e7-4c30-ace3-238c943846ac req-b1922bb9-8e48-43fe-91ce-fd6f47751ef4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:06:48 compute-0 nova_compute[187639]: 2026-02-23 11:06:48.228 187643 DEBUG oslo_concurrency.lockutils [req-651c9229-b1e7-4c30-ace3-238c943846ac req-b1922bb9-8e48-43fe-91ce-fd6f47751ef4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:06:48 compute-0 nova_compute[187639]: 2026-02-23 11:06:48.228 187643 DEBUG oslo_concurrency.lockutils [req-651c9229-b1e7-4c30-ace3-238c943846ac req-b1922bb9-8e48-43fe-91ce-fd6f47751ef4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:06:48 compute-0 nova_compute[187639]: 2026-02-23 11:06:48.229 187643 DEBUG oslo_concurrency.lockutils [req-651c9229-b1e7-4c30-ace3-238c943846ac req-b1922bb9-8e48-43fe-91ce-fd6f47751ef4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:06:48 compute-0 nova_compute[187639]: 2026-02-23 11:06:48.229 187643 DEBUG nova.compute.manager [req-651c9229-b1e7-4c30-ace3-238c943846ac req-b1922bb9-8e48-43fe-91ce-fd6f47751ef4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] No waiting events found dispatching network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:06:48 compute-0 nova_compute[187639]: 2026-02-23 11:06:48.230 187643 WARNING nova.compute.manager [req-651c9229-b1e7-4c30-ace3-238c943846ac req-b1922bb9-8e48-43fe-91ce-fd6f47751ef4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received unexpected event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e for instance with vm_state active and task_state None.
Feb 23 11:06:48 compute-0 podman[212317]: 2026-02-23 11:06:48.841543379 +0000 UTC m=+0.049372729 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 11:06:49 compute-0 nova_compute[187639]: 2026-02-23 11:06:49.528 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:51 compute-0 nova_compute[187639]: 2026-02-23 11:06:51.610 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:53 compute-0 podman[212338]: 2026-02-23 11:06:53.891961005 +0000 UTC m=+0.089043040 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:06:54 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:06:54.373 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:06:54 compute-0 nova_compute[187639]: 2026-02-23 11:06:54.530 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:56 compute-0 nova_compute[187639]: 2026-02-23 11:06:56.612 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:58 compute-0 nova_compute[187639]: 2026-02-23 11:06:58.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:58 compute-0 nova_compute[187639]: 2026-02-23 11:06:58.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:58 compute-0 nova_compute[187639]: 2026-02-23 11:06:58.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:06:58 compute-0 podman[212381]: 2026-02-23 11:06:58.875399568 +0000 UTC m=+0.084857320 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, architecture=x86_64)
Feb 23 11:06:59 compute-0 ovn_controller[97601]: 2026-02-23T11:06:59Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:11:8a 10.100.0.8
Feb 23 11:06:59 compute-0 ovn_controller[97601]: 2026-02-23T11:06:59Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:11:8a 10.100.0.8
Feb 23 11:06:59 compute-0 nova_compute[187639]: 2026-02-23 11:06:59.533 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:06:59 compute-0 nova_compute[187639]: 2026-02-23 11:06:59.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:06:59 compute-0 podman[197002]: time="2026-02-23T11:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:06:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:06:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.989 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.989 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.990 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:07:00 compute-0 nova_compute[187639]: 2026-02-23 11:07:00.990 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid cf40a379-a79f-48d0-9b8d-15588edbccbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:07:01 compute-0 openstack_network_exporter[199919]: ERROR   11:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:07:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:07:01 compute-0 openstack_network_exporter[199919]: ERROR   11:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:07:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:07:01 compute-0 nova_compute[187639]: 2026-02-23 11:07:01.613 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.219 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Updating instance_info_cache with network_info: [{"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.349 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-cf40a379-a79f-48d0-9b8d-15588edbccbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.350 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.351 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.351 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.535 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.715 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.715 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.715 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.716 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.797 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.878 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.879 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:04 compute-0 nova_compute[187639]: 2026-02-23 11:07:04.954 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.082 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.083 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5631MB free_disk=73.17671966552734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.083 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.084 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.171 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance cf40a379-a79f-48d0-9b8d-15588edbccbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.171 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.171 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.231 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.245 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.273 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:07:05 compute-0 nova_compute[187639]: 2026-02-23 11:07:05.274 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:06 compute-0 nova_compute[187639]: 2026-02-23 11:07:06.615 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:08 compute-0 nova_compute[187639]: 2026-02-23 11:07:08.275 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:09 compute-0 nova_compute[187639]: 2026-02-23 11:07:09.536 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:11 compute-0 nova_compute[187639]: 2026-02-23 11:07:11.617 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:11 compute-0 podman[212411]: 2026-02-23 11:07:11.852908822 +0000 UTC m=+0.047059107 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 11:07:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:12.651 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:12.651 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:12.652 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:14 compute-0 nova_compute[187639]: 2026-02-23 11:07:14.520 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Creating tmpfile /var/lib/nova/instances/tmp2cqm4ozp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 23 11:07:14 compute-0 nova_compute[187639]: 2026-02-23 11:07:14.521 187643 DEBUG nova.compute.manager [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2cqm4ozp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 23 11:07:14 compute-0 nova_compute[187639]: 2026-02-23 11:07:14.539 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:15 compute-0 ovn_controller[97601]: 2026-02-23T11:07:15Z|00104|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 23 11:07:15 compute-0 nova_compute[187639]: 2026-02-23 11:07:15.788 187643 DEBUG nova.compute.manager [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2cqm4ozp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c0d1c646-81a3-4cea-8fc4-9a465a7b39ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 23 11:07:15 compute-0 nova_compute[187639]: 2026-02-23 11:07:15.834 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:07:15 compute-0 nova_compute[187639]: 2026-02-23 11:07:15.835 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:07:15 compute-0 nova_compute[187639]: 2026-02-23 11:07:15.835 187643 DEBUG nova.network.neutron [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:07:16 compute-0 nova_compute[187639]: 2026-02-23 11:07:16.618 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.241 187643 DEBUG nova.network.neutron [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Updating instance_info_cache with network_info: [{"id": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "address": "fa:16:3e:e9:87:53", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdec606c4-ff", "ovs_interfaceid": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.262 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.263 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2cqm4ozp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c0d1c646-81a3-4cea-8fc4-9a465a7b39ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.264 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Creating instance directory: /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.264 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Creating disk.info with the contents: {'/var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk': 'qcow2', '/var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.264 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.265 187643 DEBUG nova.objects.instance [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'trusted_certs' on Instance uuid c0d1c646-81a3-4cea-8fc4-9a465a7b39ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.296 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.339 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.340 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.340 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.350 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.424 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.425 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.453 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.454 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.455 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.520 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.522 187643 DEBUG nova.virt.disk.api [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Checking if we can resize image /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.523 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.580 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.581 187643 DEBUG nova.virt.disk.api [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Cannot resize image /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.581 187643 DEBUG nova.objects.instance [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid c0d1c646-81a3-4cea-8fc4-9a465a7b39ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.593 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.614 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.616 187643 DEBUG nova.virt.libvirt.volume.remotefs [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk.config to /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 23 11:07:17 compute-0 nova_compute[187639]: 2026-02-23 11:07:17.617 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk.config /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:07:17 compute-0 sshd-session[212446]: Invalid user admin from 143.198.30.3 port 60086
Feb 23 11:07:17 compute-0 sshd-session[212446]: Connection closed by invalid user admin 143.198.30.3 port 60086 [preauth]
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.034 187643 DEBUG oslo_concurrency.processutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba/disk.config /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.036 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.037 187643 DEBUG nova.virt.libvirt.vif [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:06:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1058536103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1058536103',id=13,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:06:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-q6b1ljj6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:06:28Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=c0d1c646-81a3-4cea-8fc4-9a465a7b39ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "address": "fa:16:3e:e9:87:53", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdec606c4-ff", "ovs_interfaceid": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.038 187643 DEBUG nova.network.os_vif_util [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "address": "fa:16:3e:e9:87:53", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdec606c4-ff", "ovs_interfaceid": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.039 187643 DEBUG nova.network.os_vif_util [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:87:53,bridge_name='br-int',has_traffic_filtering=True,id=dec606c4-ff4f-4dd1-86f2-d47e942397df,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdec606c4-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.039 187643 DEBUG os_vif [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:87:53,bridge_name='br-int',has_traffic_filtering=True,id=dec606c4-ff4f-4dd1-86f2-d47e942397df,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdec606c4-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.040 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.040 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.041 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.043 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.043 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdec606c4-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.044 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdec606c4-ff, col_values=(('external_ids', {'iface-id': 'dec606c4-ff4f-4dd1-86f2-d47e942397df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:87:53', 'vm-uuid': 'c0d1c646-81a3-4cea-8fc4-9a465a7b39ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.078 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:18 compute-0 NetworkManager[57207]: <info>  [1771844838.0796] manager: (tapdec606c4-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.081 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.087 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.088 187643 INFO os_vif [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:87:53,bridge_name='br-int',has_traffic_filtering=True,id=dec606c4-ff4f-4dd1-86f2-d47e942397df,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdec606c4-ff')
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.088 187643 DEBUG nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 23 11:07:18 compute-0 nova_compute[187639]: 2026-02-23 11:07:18.089 187643 DEBUG nova.compute.manager [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2cqm4ozp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c0d1c646-81a3-4cea-8fc4-9a465a7b39ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 23 11:07:19 compute-0 podman[212460]: 2026-02-23 11:07:19.878662575 +0000 UTC m=+0.064752586 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 11:07:20 compute-0 sshd-session[212480]: Connection closed by authenticating user root 165.227.79.48 port 34448 [preauth]
Feb 23 11:07:21 compute-0 nova_compute[187639]: 2026-02-23 11:07:21.657 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:22 compute-0 nova_compute[187639]: 2026-02-23 11:07:22.069 187643 DEBUG nova.network.neutron [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Port dec606c4-ff4f-4dd1-86f2-d47e942397df updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 23 11:07:22 compute-0 nova_compute[187639]: 2026-02-23 11:07:22.070 187643 DEBUG nova.compute.manager [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2cqm4ozp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c0d1c646-81a3-4cea-8fc4-9a465a7b39ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 23 11:07:22 compute-0 kernel: tapdec606c4-ff: entered promiscuous mode
Feb 23 11:07:22 compute-0 NetworkManager[57207]: <info>  [1771844842.2983] manager: (tapdec606c4-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Feb 23 11:07:22 compute-0 nova_compute[187639]: 2026-02-23 11:07:22.299 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:22 compute-0 ovn_controller[97601]: 2026-02-23T11:07:22Z|00105|binding|INFO|Claiming lport dec606c4-ff4f-4dd1-86f2-d47e942397df for this additional chassis.
Feb 23 11:07:22 compute-0 ovn_controller[97601]: 2026-02-23T11:07:22Z|00106|binding|INFO|dec606c4-ff4f-4dd1-86f2-d47e942397df: Claiming fa:16:3e:e9:87:53 10.100.0.7
Feb 23 11:07:22 compute-0 nova_compute[187639]: 2026-02-23 11:07:22.305 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:22 compute-0 ovn_controller[97601]: 2026-02-23T11:07:22Z|00107|binding|INFO|Setting lport dec606c4-ff4f-4dd1-86f2-d47e942397df ovn-installed in OVS
Feb 23 11:07:22 compute-0 nova_compute[187639]: 2026-02-23 11:07:22.308 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:22 compute-0 nova_compute[187639]: 2026-02-23 11:07:22.312 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:22 compute-0 systemd-machined[156970]: New machine qemu-10-instance-0000000d.
Feb 23 11:07:22 compute-0 systemd-udevd[212497]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:07:22 compute-0 NetworkManager[57207]: <info>  [1771844842.3418] device (tapdec606c4-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:07:22 compute-0 NetworkManager[57207]: <info>  [1771844842.3426] device (tapdec606c4-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:07:22 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Feb 23 11:07:23 compute-0 nova_compute[187639]: 2026-02-23 11:07:23.079 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:23 compute-0 nova_compute[187639]: 2026-02-23 11:07:23.724 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844843.7243974, c0d1c646-81a3-4cea-8fc4-9a465a7b39ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:07:23 compute-0 nova_compute[187639]: 2026-02-23 11:07:23.725 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] VM Started (Lifecycle Event)
Feb 23 11:07:23 compute-0 nova_compute[187639]: 2026-02-23 11:07:23.758 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:07:24 compute-0 nova_compute[187639]: 2026-02-23 11:07:24.318 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844844.318189, c0d1c646-81a3-4cea-8fc4-9a465a7b39ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:07:24 compute-0 nova_compute[187639]: 2026-02-23 11:07:24.319 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] VM Resumed (Lifecycle Event)
Feb 23 11:07:24 compute-0 nova_compute[187639]: 2026-02-23 11:07:24.340 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:07:24 compute-0 nova_compute[187639]: 2026-02-23 11:07:24.342 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:07:24 compute-0 nova_compute[187639]: 2026-02-23 11:07:24.377 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 23 11:07:24 compute-0 podman[212528]: 2026-02-23 11:07:24.914874061 +0000 UTC m=+0.105723861 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 11:07:25 compute-0 ovn_controller[97601]: 2026-02-23T11:07:25Z|00108|binding|INFO|Claiming lport dec606c4-ff4f-4dd1-86f2-d47e942397df for this chassis.
Feb 23 11:07:25 compute-0 ovn_controller[97601]: 2026-02-23T11:07:25Z|00109|binding|INFO|dec606c4-ff4f-4dd1-86f2-d47e942397df: Claiming fa:16:3e:e9:87:53 10.100.0.7
Feb 23 11:07:25 compute-0 ovn_controller[97601]: 2026-02-23T11:07:25Z|00110|binding|INFO|Setting lport dec606c4-ff4f-4dd1-86f2-d47e942397df up in Southbound
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.263 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:87:53 10.100.0.7'], port_security=['fa:16:3e:e9:87:53 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c0d1c646-81a3-4cea-8fc4-9a465a7b39ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '11', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=dec606c4-ff4f-4dd1-86f2-d47e942397df) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.265 106968 INFO neutron.agent.ovn.metadata.agent [-] Port dec606c4-ff4f-4dd1-86f2-d47e942397df in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.268 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.287 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[17f779b2-bd2d-4ee5-a7bb-85f8a1a1dfe4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.312 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[2341d962-c8d9-4eac-b7b5-acfc1de1ec6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.318 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[3931ae28-c5c2-432b-9df7-335fb58ebf77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.342 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[f445ed47-6dab-4947-a47c-def05056266f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.360 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f221fb90-22c7-4d1b-8985-c0811f66d04e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394991, 'reachable_time': 37377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212561, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.380 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[4499882f-b46b-42ef-8771-dd934526cf80]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395000, 'tstamp': 395000}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212562, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395001, 'tstamp': 395001}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212562, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.382 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:25 compute-0 nova_compute[187639]: 2026-02-23 11:07:25.427 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:25 compute-0 nova_compute[187639]: 2026-02-23 11:07:25.429 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.430 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.430 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.431 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:25.432 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:07:25 compute-0 nova_compute[187639]: 2026-02-23 11:07:25.529 187643 INFO nova.compute.manager [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Post operation of migration started
Feb 23 11:07:25 compute-0 nova_compute[187639]: 2026-02-23 11:07:25.833 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:07:25 compute-0 nova_compute[187639]: 2026-02-23 11:07:25.833 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:07:25 compute-0 nova_compute[187639]: 2026-02-23 11:07:25.834 187643 DEBUG nova.network.neutron [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:07:26 compute-0 nova_compute[187639]: 2026-02-23 11:07:26.660 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:27 compute-0 nova_compute[187639]: 2026-02-23 11:07:27.236 187643 DEBUG nova.network.neutron [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Updating instance_info_cache with network_info: [{"id": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "address": "fa:16:3e:e9:87:53", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdec606c4-ff", "ovs_interfaceid": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:07:27 compute-0 nova_compute[187639]: 2026-02-23 11:07:27.259 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:07:27 compute-0 nova_compute[187639]: 2026-02-23 11:07:27.273 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:27 compute-0 nova_compute[187639]: 2026-02-23 11:07:27.273 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:27 compute-0 nova_compute[187639]: 2026-02-23 11:07:27.274 187643 DEBUG oslo_concurrency.lockutils [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:27 compute-0 nova_compute[187639]: 2026-02-23 11:07:27.279 187643 INFO nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 23 11:07:27 compute-0 virtqemud[186733]: Domain id=10 name='instance-0000000d' uuid=c0d1c646-81a3-4cea-8fc4-9a465a7b39ba is tainted: custom-monitor
Feb 23 11:07:28 compute-0 nova_compute[187639]: 2026-02-23 11:07:28.082 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:28 compute-0 nova_compute[187639]: 2026-02-23 11:07:28.284 187643 INFO nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 23 11:07:29 compute-0 nova_compute[187639]: 2026-02-23 11:07:29.289 187643 INFO nova.virt.libvirt.driver [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 23 11:07:29 compute-0 nova_compute[187639]: 2026-02-23 11:07:29.293 187643 DEBUG nova.compute.manager [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:07:29 compute-0 nova_compute[187639]: 2026-02-23 11:07:29.315 187643 DEBUG nova.objects.instance [None req-8c5552b0-9420-4f91-8046-9b3fded1ccc4 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 23 11:07:29 compute-0 podman[197002]: time="2026-02-23T11:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:07:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:07:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 23 11:07:29 compute-0 podman[212563]: 2026-02-23 11:07:29.850300499 +0000 UTC m=+0.047017587 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 23 11:07:31 compute-0 openstack_network_exporter[199919]: ERROR   11:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:07:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:07:31 compute-0 openstack_network_exporter[199919]: ERROR   11:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:07:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:07:31 compute-0 nova_compute[187639]: 2026-02-23 11:07:31.663 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:33 compute-0 nova_compute[187639]: 2026-02-23 11:07:33.085 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.504 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.504 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.505 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.505 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.505 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.506 187643 INFO nova.compute.manager [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Terminating instance
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.507 187643 DEBUG nova.compute.manager [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:07:35 compute-0 kernel: tap3dc893be-d3 (unregistering): left promiscuous mode
Feb 23 11:07:35 compute-0 NetworkManager[57207]: <info>  [1771844855.5513] device (tap3dc893be-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.556 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 ovn_controller[97601]: 2026-02-23T11:07:35Z|00111|binding|INFO|Releasing lport 3dc893be-d39b-4ea7-baae-7b504533898e from this chassis (sb_readonly=0)
Feb 23 11:07:35 compute-0 ovn_controller[97601]: 2026-02-23T11:07:35Z|00112|binding|INFO|Setting lport 3dc893be-d39b-4ea7-baae-7b504533898e down in Southbound
Feb 23 11:07:35 compute-0 ovn_controller[97601]: 2026-02-23T11:07:35Z|00113|binding|INFO|Removing iface tap3dc893be-d3 ovn-installed in OVS
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.564 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.565 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:11:8a 10.100.0.8'], port_security=['fa:16:3e:17:11:8a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'cf40a379-a79f-48d0-9b8d-15588edbccbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=3dc893be-d39b-4ea7-baae-7b504533898e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.566 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 3dc893be-d39b-4ea7-baae-7b504533898e in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.568 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.579 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ac7433-3f2d-4499-bc9f-857c9ba3d876]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:35 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Feb 23 11:07:35 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Consumed 13.096s CPU time.
Feb 23 11:07:35 compute-0 systemd-machined[156970]: Machine qemu-9-instance-0000000e terminated.
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.595 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[67ba62a6-ab8e-4d85-913f-24361029c196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.598 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[92722413-85b9-4166-9fa3-d662bb85d71a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.615 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[3e42c5d9-66f1-49b9-bcff-8395f0934119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.629 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[224c9330-6290-4b66-b5b8-d76341553ffd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394991, 'reachable_time': 37377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212599, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.640 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb84d67-469a-4d5d-bb83-122097a0f39e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395000, 'tstamp': 395000}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212600, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395001, 'tstamp': 395001}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212600, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.642 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.643 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.647 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.647 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.647 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.648 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:35.648 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.753 187643 INFO nova.virt.libvirt.driver [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Instance destroyed successfully.
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.754 187643 DEBUG nova.objects.instance [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid cf40a379-a79f-48d0-9b8d-15588edbccbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.766 187643 DEBUG nova.virt.libvirt.vif [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:06:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1484974806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1484974806',id=14,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:06:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-wbslk2ev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:06:46Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=cf40a379-a79f-48d0-9b8d-15588edbccbe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.766 187643 DEBUG nova.network.os_vif_util [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "3dc893be-d39b-4ea7-baae-7b504533898e", "address": "fa:16:3e:17:11:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc893be-d3", "ovs_interfaceid": "3dc893be-d39b-4ea7-baae-7b504533898e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.767 187643 DEBUG nova.network.os_vif_util [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.767 187643 DEBUG os_vif [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.768 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.768 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3dc893be-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.769 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.770 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.772 187643 INFO os_vif [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:11:8a,bridge_name='br-int',has_traffic_filtering=True,id=3dc893be-d39b-4ea7-baae-7b504533898e,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc893be-d3')
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.772 187643 INFO nova.virt.libvirt.driver [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Deleting instance files /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe_del
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.773 187643 INFO nova.virt.libvirt.driver [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Deletion of /var/lib/nova/instances/cf40a379-a79f-48d0-9b8d-15588edbccbe_del complete
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.837 187643 INFO nova.compute.manager [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.838 187643 DEBUG oslo.service.loopingcall [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.838 187643 DEBUG nova.compute.manager [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:07:35 compute-0 nova_compute[187639]: 2026-02-23 11:07:35.838 187643 DEBUG nova.network.neutron [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.107 187643 DEBUG nova.compute.manager [req-d3cc2890-08fa-4829-a9a7-f68ecd4ca00f req-cc90977a-0a40-4062-9e50-a497ef9ab21d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-vif-unplugged-3dc893be-d39b-4ea7-baae-7b504533898e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.108 187643 DEBUG oslo_concurrency.lockutils [req-d3cc2890-08fa-4829-a9a7-f68ecd4ca00f req-cc90977a-0a40-4062-9e50-a497ef9ab21d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.108 187643 DEBUG oslo_concurrency.lockutils [req-d3cc2890-08fa-4829-a9a7-f68ecd4ca00f req-cc90977a-0a40-4062-9e50-a497ef9ab21d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.108 187643 DEBUG oslo_concurrency.lockutils [req-d3cc2890-08fa-4829-a9a7-f68ecd4ca00f req-cc90977a-0a40-4062-9e50-a497ef9ab21d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.109 187643 DEBUG nova.compute.manager [req-d3cc2890-08fa-4829-a9a7-f68ecd4ca00f req-cc90977a-0a40-4062-9e50-a497ef9ab21d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] No waiting events found dispatching network-vif-unplugged-3dc893be-d39b-4ea7-baae-7b504533898e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.109 187643 DEBUG nova.compute.manager [req-d3cc2890-08fa-4829-a9a7-f68ecd4ca00f req-cc90977a-0a40-4062-9e50-a497ef9ab21d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-vif-unplugged-3dc893be-d39b-4ea7-baae-7b504533898e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.537 187643 DEBUG nova.network.neutron [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.567 187643 INFO nova.compute.manager [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Took 0.73 seconds to deallocate network for instance.
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.631 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.632 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.708 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.739 187643 DEBUG nova.compute.provider_tree [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.754 187643 DEBUG nova.scheduler.client.report [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.777 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.806 187643 INFO nova.scheduler.client.report [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance cf40a379-a79f-48d0-9b8d-15588edbccbe
Feb 23 11:07:36 compute-0 nova_compute[187639]: 2026-02-23 11:07:36.880 187643 DEBUG oslo_concurrency.lockutils [None req-c5270e26-9021-429b-89a7-3697d2ea8660 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.671 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.672 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.672 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.672 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.673 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.674 187643 INFO nova.compute.manager [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Terminating instance
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.675 187643 DEBUG nova.compute.manager [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:07:37 compute-0 kernel: tapdec606c4-ff (unregistering): left promiscuous mode
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.697 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 NetworkManager[57207]: <info>  [1771844857.6979] device (tapdec606c4-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:07:37 compute-0 ovn_controller[97601]: 2026-02-23T11:07:37Z|00114|binding|INFO|Releasing lport dec606c4-ff4f-4dd1-86f2-d47e942397df from this chassis (sb_readonly=0)
Feb 23 11:07:37 compute-0 ovn_controller[97601]: 2026-02-23T11:07:37Z|00115|binding|INFO|Setting lport dec606c4-ff4f-4dd1-86f2-d47e942397df down in Southbound
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.701 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 ovn_controller[97601]: 2026-02-23T11:07:37Z|00116|binding|INFO|Removing iface tapdec606c4-ff ovn-installed in OVS
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.703 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.705 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:37.708 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:87:53 10.100.0.7'], port_security=['fa:16:3e:e9:87:53 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c0d1c646-81a3-4cea-8fc4-9a465a7b39ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '13', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=dec606c4-ff4f-4dd1-86f2-d47e942397df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:07:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:37.709 106968 INFO neutron.agent.ovn.metadata.agent [-] Port dec606c4-ff4f-4dd1-86f2-d47e942397df in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:07:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:37.711 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:07:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:37.711 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[66a47df1-80c8-4035-8213-f5691bd93050]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:37.712 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:07:37 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 23 11:07:37 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 2.183s CPU time.
Feb 23 11:07:37 compute-0 systemd-machined[156970]: Machine qemu-10-instance-0000000d terminated.
Feb 23 11:07:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [NOTICE]   (212305) : haproxy version is 2.8.14-c23fe91
Feb 23 11:07:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [NOTICE]   (212305) : path to executable is /usr/sbin/haproxy
Feb 23 11:07:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [WARNING]  (212305) : Exiting Master process...
Feb 23 11:07:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [ALERT]    (212305) : Current worker (212307) exited with code 143 (Terminated)
Feb 23 11:07:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212297]: [WARNING]  (212305) : All workers exited. Exiting... (0)
Feb 23 11:07:37 compute-0 systemd[1]: libpod-c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4.scope: Deactivated successfully.
Feb 23 11:07:37 compute-0 podman[212640]: 2026-02-23 11:07:37.828478703 +0000 UTC m=+0.050873198 container died c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.913 187643 INFO nova.virt.libvirt.driver [-] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Instance destroyed successfully.
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.913 187643 DEBUG nova.objects.instance [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid c0d1c646-81a3-4cea-8fc4-9a465a7b39ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.928 187643 DEBUG nova.virt.libvirt.vif [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-23T11:06:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1058536103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1058536103',id=13,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:06:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-q6b1ljj6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:07:29Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=c0d1c646-81a3-4cea-8fc4-9a465a7b39ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "address": "fa:16:3e:e9:87:53", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdec606c4-ff", "ovs_interfaceid": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.929 187643 DEBUG nova.network.os_vif_util [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "address": "fa:16:3e:e9:87:53", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdec606c4-ff", "ovs_interfaceid": "dec606c4-ff4f-4dd1-86f2-d47e942397df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.929 187643 DEBUG nova.network.os_vif_util [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:87:53,bridge_name='br-int',has_traffic_filtering=True,id=dec606c4-ff4f-4dd1-86f2-d47e942397df,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdec606c4-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.929 187643 DEBUG os_vif [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:87:53,bridge_name='br-int',has_traffic_filtering=True,id=dec606c4-ff4f-4dd1-86f2-d47e942397df,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdec606c4-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:07:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4-userdata-shm.mount: Deactivated successfully.
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.930 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.931 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdec606c4-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.932 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.933 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b9397ecf10d89e011e9f0f2a95beb035a6bc6fcdf7edba9161010f7db7901ed-merged.mount: Deactivated successfully.
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.936 187643 INFO os_vif [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:87:53,bridge_name='br-int',has_traffic_filtering=True,id=dec606c4-ff4f-4dd1-86f2-d47e942397df,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdec606c4-ff')
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.937 187643 INFO nova.virt.libvirt.driver [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Deleting instance files /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba_del
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.937 187643 INFO nova.virt.libvirt.driver [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Deletion of /var/lib/nova/instances/c0d1c646-81a3-4cea-8fc4-9a465a7b39ba_del complete
Feb 23 11:07:37 compute-0 podman[212640]: 2026-02-23 11:07:37.939312668 +0000 UTC m=+0.161707163 container cleanup c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 11:07:37 compute-0 systemd[1]: libpod-conmon-c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4.scope: Deactivated successfully.
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.989 187643 INFO nova.compute.manager [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Took 0.31 seconds to destroy the instance on the hypervisor.
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.989 187643 DEBUG oslo.service.loopingcall [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.989 187643 DEBUG nova.compute.manager [-] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:07:37 compute-0 nova_compute[187639]: 2026-02-23 11:07:37.989 187643 DEBUG nova.network.neutron [-] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:07:37 compute-0 podman[212686]: 2026-02-23 11:07:37.995427044 +0000 UTC m=+0.035690606 container remove c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 23 11:07:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:37.998 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1e9e51b9-ebd6-48a1-a6a8-c0a39e8b1143]: (4, ('Mon Feb 23 11:07:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4)\nc72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4\nMon Feb 23 11:07:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (c72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4)\nc72bd3f5a15b19489903ef523a3470bcb087306ae55a47823d1cb342c54a70f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.000 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9cc14e-511d-4d04-9b3a-8e36ef8de5fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.001 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.002 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:38 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.006 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.008 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc6daf7-9881-40e2-a878-0c0db4c3d60d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.028 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7e216c2e-cc58-41fb-8397-3eedbf2fcea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.030 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3b5811-3a4e-46a3-901f-228b9c05ff75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.042 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[41a16cc7-e744-49c5-be33-b4893eacccc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394986, 'reachable_time': 32918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212704, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.044 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:07:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:07:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:07:38.044 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[2e001d1c-c56d-437b-94df-afc46c0c9820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.211 187643 DEBUG nova.compute.manager [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.212 187643 DEBUG oslo_concurrency.lockutils [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.212 187643 DEBUG oslo_concurrency.lockutils [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.212 187643 DEBUG oslo_concurrency.lockutils [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "cf40a379-a79f-48d0-9b8d-15588edbccbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.213 187643 DEBUG nova.compute.manager [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] No waiting events found dispatching network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.213 187643 WARNING nova.compute.manager [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received unexpected event network-vif-plugged-3dc893be-d39b-4ea7-baae-7b504533898e for instance with vm_state deleted and task_state None.
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.213 187643 DEBUG nova.compute.manager [req-d59f6e21-b874-4e24-b6a0-cd44395aeb2a req-3ae07fd3-55a0-43a4-96ac-14d6f9dd7906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Received event network-vif-deleted-3dc893be-d39b-4ea7-baae-7b504533898e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.667 187643 DEBUG nova.network.neutron [-] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.692 187643 INFO nova.compute.manager [-] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Took 0.70 seconds to deallocate network for instance.
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.752 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.753 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.759 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.789 187643 INFO nova.scheduler.client.report [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance c0d1c646-81a3-4cea-8fc4-9a465a7b39ba
Feb 23 11:07:38 compute-0 nova_compute[187639]: 2026-02-23 11:07:38.844 187643 DEBUG oslo_concurrency.lockutils [None req-ec15a3f4-f800-4b44-8880-23c6c349c233 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.326 187643 DEBUG nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Received event network-vif-unplugged-dec606c4-ff4f-4dd1-86f2-d47e942397df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.327 187643 DEBUG oslo_concurrency.lockutils [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.328 187643 DEBUG oslo_concurrency.lockutils [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.328 187643 DEBUG oslo_concurrency.lockutils [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.329 187643 DEBUG nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] No waiting events found dispatching network-vif-unplugged-dec606c4-ff4f-4dd1-86f2-d47e942397df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.329 187643 WARNING nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Received unexpected event network-vif-unplugged-dec606c4-ff4f-4dd1-86f2-d47e942397df for instance with vm_state deleted and task_state None.
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.329 187643 DEBUG nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Received event network-vif-plugged-dec606c4-ff4f-4dd1-86f2-d47e942397df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.330 187643 DEBUG oslo_concurrency.lockutils [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.330 187643 DEBUG oslo_concurrency.lockutils [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.330 187643 DEBUG oslo_concurrency.lockutils [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "c0d1c646-81a3-4cea-8fc4-9a465a7b39ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.331 187643 DEBUG nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] No waiting events found dispatching network-vif-plugged-dec606c4-ff4f-4dd1-86f2-d47e942397df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.331 187643 WARNING nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Received unexpected event network-vif-plugged-dec606c4-ff4f-4dd1-86f2-d47e942397df for instance with vm_state deleted and task_state None.
Feb 23 11:07:40 compute-0 nova_compute[187639]: 2026-02-23 11:07:40.332 187643 DEBUG nova.compute.manager [req-2c4e9b3c-ad2e-4f99-aef5-73088473367a req-222b4ffe-3b45-40a7-a881-400dcb1271da 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Received event network-vif-deleted-dec606c4-ff4f-4dd1-86f2-d47e942397df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:07:41 compute-0 nova_compute[187639]: 2026-02-23 11:07:41.709 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:42 compute-0 podman[212705]: 2026-02-23 11:07:42.844583497 +0000 UTC m=+0.051762811 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:07:42 compute-0 nova_compute[187639]: 2026-02-23 11:07:42.933 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:46 compute-0 nova_compute[187639]: 2026-02-23 11:07:46.757 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:47 compute-0 nova_compute[187639]: 2026-02-23 11:07:47.978 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:48 compute-0 sshd-session[212730]: Invalid user admin from 143.198.30.3 port 52992
Feb 23 11:07:48 compute-0 sshd-session[212730]: Connection closed by invalid user admin 143.198.30.3 port 52992 [preauth]
Feb 23 11:07:50 compute-0 nova_compute[187639]: 2026-02-23 11:07:50.753 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844855.7514665, cf40a379-a79f-48d0-9b8d-15588edbccbe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:07:50 compute-0 nova_compute[187639]: 2026-02-23 11:07:50.754 187643 INFO nova.compute.manager [-] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] VM Stopped (Lifecycle Event)
Feb 23 11:07:50 compute-0 nova_compute[187639]: 2026-02-23 11:07:50.784 187643 DEBUG nova.compute.manager [None req-314e7942-7d17-4b11-8149-b2f2d8fbf5ca - - - - - -] [instance: cf40a379-a79f-48d0-9b8d-15588edbccbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:07:50 compute-0 podman[212732]: 2026-02-23 11:07:50.88847447 +0000 UTC m=+0.088511765 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:07:51 compute-0 nova_compute[187639]: 2026-02-23 11:07:51.760 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.714 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.911 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844857.91071, c0d1c646-81a3-4cea-8fc4-9a465a7b39ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.912 187643 INFO nova.compute.manager [-] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] VM Stopped (Lifecycle Event)
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.940 187643 DEBUG nova.compute.manager [None req-c357e20e-1c28-4a71-ad65-73dd1c26b350 - - - - - -] [instance: c0d1c646-81a3-4cea-8fc4-9a465a7b39ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:07:52 compute-0 nova_compute[187639]: 2026-02-23 11:07:52.981 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:55 compute-0 podman[212752]: 2026-02-23 11:07:55.890753517 +0000 UTC m=+0.095875800 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:07:56 compute-0 nova_compute[187639]: 2026-02-23 11:07:56.805 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:58 compute-0 nova_compute[187639]: 2026-02-23 11:07:58.027 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:07:58 compute-0 nova_compute[187639]: 2026-02-23 11:07:58.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:07:59 compute-0 podman[197002]: time="2026-02-23T11:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:07:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:07:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Feb 23 11:08:00 compute-0 nova_compute[187639]: 2026-02-23 11:08:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:00 compute-0 nova_compute[187639]: 2026-02-23 11:08:00.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:08:00 compute-0 podman[212778]: 2026-02-23 11:08:00.839514096 +0000 UTC m=+0.045924016 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7)
Feb 23 11:08:01 compute-0 openstack_network_exporter[199919]: ERROR   11:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:08:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:08:01 compute-0 openstack_network_exporter[199919]: ERROR   11:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:08:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:08:01 compute-0 nova_compute[187639]: 2026-02-23 11:08:01.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:01 compute-0 nova_compute[187639]: 2026-02-23 11:08:01.807 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:02 compute-0 nova_compute[187639]: 2026-02-23 11:08:02.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:02 compute-0 nova_compute[187639]: 2026-02-23 11:08:02.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:08:02 compute-0 nova_compute[187639]: 2026-02-23 11:08:02.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:08:02 compute-0 nova_compute[187639]: 2026-02-23 11:08:02.715 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:08:02 compute-0 nova_compute[187639]: 2026-02-23 11:08:02.715 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:03 compute-0 nova_compute[187639]: 2026-02-23 11:08:03.028 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:03 compute-0 nova_compute[187639]: 2026-02-23 11:08:03.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:03 compute-0 nova_compute[187639]: 2026-02-23 11:08:03.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.723 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.725 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.928 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.929 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.20555114746094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.929 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:04 compute-0 nova_compute[187639]: 2026-02-23 11:08:04.930 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:05 compute-0 nova_compute[187639]: 2026-02-23 11:08:05.078 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:08:05 compute-0 nova_compute[187639]: 2026-02-23 11:08:05.078 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:08:05 compute-0 nova_compute[187639]: 2026-02-23 11:08:05.102 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:08:05 compute-0 nova_compute[187639]: 2026-02-23 11:08:05.125 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:08:05 compute-0 nova_compute[187639]: 2026-02-23 11:08:05.161 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:08:05 compute-0 nova_compute[187639]: 2026-02-23 11:08:05.161 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:05 compute-0 sshd-session[212800]: Connection closed by authenticating user root 165.227.79.48 port 38056 [preauth]
Feb 23 11:08:06 compute-0 nova_compute[187639]: 2026-02-23 11:08:06.846 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:08 compute-0 nova_compute[187639]: 2026-02-23 11:08:08.067 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:08 compute-0 ovn_controller[97601]: 2026-02-23T11:08:08Z|00117|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 23 11:08:10 compute-0 nova_compute[187639]: 2026-02-23 11:08:10.162 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:10 compute-0 nova_compute[187639]: 2026-02-23 11:08:10.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:11 compute-0 nova_compute[187639]: 2026-02-23 11:08:11.718 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:11 compute-0 nova_compute[187639]: 2026-02-23 11:08:11.881 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:12.652 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:12.652 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:12.653 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:13 compute-0 nova_compute[187639]: 2026-02-23 11:08:13.069 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:13 compute-0 nova_compute[187639]: 2026-02-23 11:08:13.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:13 compute-0 nova_compute[187639]: 2026-02-23 11:08:13.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 11:08:13 compute-0 podman[212803]: 2026-02-23 11:08:13.839594577 +0000 UTC m=+0.044945200 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.236 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.236 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.261 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.355 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.355 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.364 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.365 187643 INFO nova.compute.claims [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.523 187643 DEBUG nova.compute.provider_tree [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.541 187643 DEBUG nova.scheduler.client.report [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.561 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.562 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.605 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.606 187643 DEBUG nova.network.neutron [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.625 187643 INFO nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.643 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.755 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.758 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.760 187643 INFO nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Creating image(s)
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.762 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.762 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.763 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.790 187643 DEBUG nova.policy [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.793 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.846 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.847 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.848 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.860 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.918 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.919 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.930 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.946 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.946 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.947 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.990 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.991 187643 DEBUG nova.virt.disk.api [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:08:16 compute-0 nova_compute[187639]: 2026-02-23 11:08:16.991 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.035 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.036 187643 DEBUG nova.virt.disk.api [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.036 187643 DEBUG nova.objects.instance [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid 5282a8f9-2db6-47c7-bbcb-061973c8b999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.053 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.053 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Ensure instance console log exists: /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.054 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.054 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.054 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:17.712 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:08:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:17.713 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.713 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:17 compute-0 nova_compute[187639]: 2026-02-23 11:08:17.817 187643 DEBUG nova.network.neutron [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Successfully created port: b755f0b1-c231-4652-bc2c-71033179f14c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:08:18 compute-0 nova_compute[187639]: 2026-02-23 11:08:18.134 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:18 compute-0 nova_compute[187639]: 2026-02-23 11:08:18.886 187643 DEBUG nova.network.neutron [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Successfully updated port: b755f0b1-c231-4652-bc2c-71033179f14c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:08:18 compute-0 nova_compute[187639]: 2026-02-23 11:08:18.938 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:08:18 compute-0 nova_compute[187639]: 2026-02-23 11:08:18.938 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:08:18 compute-0 nova_compute[187639]: 2026-02-23 11:08:18.938 187643 DEBUG nova.network.neutron [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.021 187643 DEBUG nova.compute.manager [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-changed-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.022 187643 DEBUG nova.compute.manager [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Refreshing instance network info cache due to event network-changed-b755f0b1-c231-4652-bc2c-71033179f14c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.022 187643 DEBUG oslo_concurrency.lockutils [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.204 187643 DEBUG nova.network.neutron [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.751 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.771 187643 WARNING nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.771 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Triggering sync for uuid 5282a8f9-2db6-47c7-bbcb-061973c8b999 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.772 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.876 187643 DEBUG nova.network.neutron [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Updating instance_info_cache with network_info: [{"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.897 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.897 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Instance network_info: |[{"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.898 187643 DEBUG oslo_concurrency.lockutils [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.898 187643 DEBUG nova.network.neutron [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Refreshing network info cache for port b755f0b1-c231-4652-bc2c-71033179f14c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.901 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Start _get_guest_xml network_info=[{"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.907 187643 WARNING nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.913 187643 DEBUG nova.virt.libvirt.host [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.913 187643 DEBUG nova.virt.libvirt.host [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.920 187643 DEBUG nova.virt.libvirt.host [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.920 187643 DEBUG nova.virt.libvirt.host [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.921 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.922 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.922 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.923 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.923 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.923 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.923 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.924 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.924 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.924 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.924 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.925 187643 DEBUG nova.virt.hardware [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.929 187643 DEBUG nova.virt.libvirt.vif [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:08:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-935332689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-935332689',id=15,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-19c7ppbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:08:16Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=5282a8f9-2db6-47c7-bbcb-061973c8b999,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.929 187643 DEBUG nova.network.os_vif_util [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.930 187643 DEBUG nova.network.os_vif_util [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.931 187643 DEBUG nova.objects.instance [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 5282a8f9-2db6-47c7-bbcb-061973c8b999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.945 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <uuid>5282a8f9-2db6-47c7-bbcb-061973c8b999</uuid>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <name>instance-0000000f</name>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-935332689</nova:name>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:08:19</nova:creationTime>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         <nova:port uuid="b755f0b1-c231-4652-bc2c-71033179f14c">
Feb 23 11:08:19 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <system>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <entry name="serial">5282a8f9-2db6-47c7-bbcb-061973c8b999</entry>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <entry name="uuid">5282a8f9-2db6-47c7-bbcb-061973c8b999</entry>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </system>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <os>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </os>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <features>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </features>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.config"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:0e:41:2a"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <target dev="tapb755f0b1-c2"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/console.log" append="off"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <video>
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </video>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:08:19 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:08:19 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:08:19 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:08:19 compute-0 nova_compute[187639]: </domain>
Feb 23 11:08:19 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.946 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Preparing to wait for external event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.946 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.947 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.947 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.948 187643 DEBUG nova.virt.libvirt.vif [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:08:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-935332689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-935332689',id=15,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-19c7ppbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:08:16Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=5282a8f9-2db6-47c7-bbcb-061973c8b999,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.948 187643 DEBUG nova.network.os_vif_util [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.949 187643 DEBUG nova.network.os_vif_util [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.949 187643 DEBUG os_vif [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.949 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.950 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.950 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.955 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.956 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb755f0b1-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.956 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb755f0b1-c2, col_values=(('external_ids', {'iface-id': 'b755f0b1-c231-4652-bc2c-71033179f14c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:41:2a', 'vm-uuid': '5282a8f9-2db6-47c7-bbcb-061973c8b999'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.958 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:19 compute-0 NetworkManager[57207]: <info>  [1771844899.9588] manager: (tapb755f0b1-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.959 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.965 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:19 compute-0 nova_compute[187639]: 2026-02-23 11:08:19.966 187643 INFO os_vif [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2')
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.009 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.010 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.010 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:0e:41:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.011 187643 INFO nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Using config drive
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.447 187643 INFO nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Creating config drive at /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.config
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.455 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpyrb4z37v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.575 187643 DEBUG oslo_concurrency.processutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpyrb4z37v" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:08:20 compute-0 kernel: tapb755f0b1-c2: entered promiscuous mode
Feb 23 11:08:20 compute-0 NetworkManager[57207]: <info>  [1771844900.6348] manager: (tapb755f0b1-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.677 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:20 compute-0 ovn_controller[97601]: 2026-02-23T11:08:20Z|00118|binding|INFO|Claiming lport b755f0b1-c231-4652-bc2c-71033179f14c for this chassis.
Feb 23 11:08:20 compute-0 ovn_controller[97601]: 2026-02-23T11:08:20Z|00119|binding|INFO|b755f0b1-c231-4652-bc2c-71033179f14c: Claiming fa:16:3e:0e:41:2a 10.100.0.14
Feb 23 11:08:20 compute-0 ovn_controller[97601]: 2026-02-23T11:08:20Z|00120|binding|INFO|Setting lport b755f0b1-c231-4652-bc2c-71033179f14c ovn-installed in OVS
Feb 23 11:08:20 compute-0 ovn_controller[97601]: 2026-02-23T11:08:20Z|00121|binding|INFO|Setting lport b755f0b1-c231-4652-bc2c-71033179f14c up in Southbound
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.685 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:41:2a 10.100.0.14'], port_security=['fa:16:3e:0e:41:2a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5282a8f9-2db6-47c7-bbcb-061973c8b999', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b755f0b1-c231-4652-bc2c-71033179f14c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.687 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.687 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b755f0b1-c231-4652-bc2c-71033179f14c in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.691 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:08:20 compute-0 systemd-udevd[212860]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.702 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[086edbbb-9615-4ff9-9b3a-23a51f276ae4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.703 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:08:20 compute-0 systemd-machined[156970]: New machine qemu-11-instance-0000000f.
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.707 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.707 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8059e185-a34f-4fe5-b1b8-517fdbdc0a1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.708 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5f86b033-3be7-45b2-8858-3763e463abb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 NetworkManager[57207]: <info>  [1771844900.7118] device (tapb755f0b1-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:08:20 compute-0 NetworkManager[57207]: <info>  [1771844900.7123] device (tapb755f0b1-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:08:20 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.719 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[18eeef5d-ad44-41db-b71e-1d1ead3a7413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.735 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[975cb18a-eba1-408c-97e2-ff6bf50764ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.756 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[86d1ec59-2671-4630-a1f2-7eaaac82dd66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.761 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0930a5f5-1e8b-4f59-9e63-ece4a41b5e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 NetworkManager[57207]: <info>  [1771844900.7627] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.790 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[538f5502-44c4-4c14-9a5f-cb28f6590c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.795 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[1df12e4e-a114-4f1c-b0e8-8b3dbed69306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 NetworkManager[57207]: <info>  [1771844900.8138] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.818 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[f8173e0d-87ae-4caf-b046-b333bf316641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.834 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7eaca681-c31b-45e3-8b84-92094d96f30c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404498, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212894, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.848 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3e0bef-732c-4ca8-bb84-44fb3f65d7ff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404498, 'tstamp': 404498}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212895, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.868 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[04064600-54d6-4808-b125-e75a4d1cf3ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404498, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212896, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.896 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[463013bc-7ee1-4e56-990a-c4d002d62444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.904 187643 DEBUG nova.compute.manager [req-ad6d6ecc-2473-4a00-b9e5-dc7d9eda3c3e req-f618a25d-64e4-4925-9ba4-bf991e71bc69 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.905 187643 DEBUG oslo_concurrency.lockutils [req-ad6d6ecc-2473-4a00-b9e5-dc7d9eda3c3e req-f618a25d-64e4-4925-9ba4-bf991e71bc69 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.905 187643 DEBUG oslo_concurrency.lockutils [req-ad6d6ecc-2473-4a00-b9e5-dc7d9eda3c3e req-f618a25d-64e4-4925-9ba4-bf991e71bc69 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.905 187643 DEBUG oslo_concurrency.lockutils [req-ad6d6ecc-2473-4a00-b9e5-dc7d9eda3c3e req-f618a25d-64e4-4925-9ba4-bf991e71bc69 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.905 187643 DEBUG nova.compute.manager [req-ad6d6ecc-2473-4a00-b9e5-dc7d9eda3c3e req-f618a25d-64e4-4925-9ba4-bf991e71bc69 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Processing event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.958 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e557bee6-2fe8-4fb0-8448-6106bfba5650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.959 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.960 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.960 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:20 compute-0 NetworkManager[57207]: <info>  [1771844900.9634] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.963 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:20 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.966 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:20 compute-0 ovn_controller[97601]: 2026-02-23T11:08:20Z|00122|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.974 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:08:20 compute-0 nova_compute[187639]: 2026-02-23 11:08:20.974 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.975 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[9cda2dc6-29b0-4242-b59f-1e2efc2dbb3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.975 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:08:20 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:20.976 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.033 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844901.032849, 5282a8f9-2db6-47c7-bbcb-061973c8b999 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.033 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] VM Started (Lifecycle Event)
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.037 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.042 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.046 187643 INFO nova.virt.libvirt.driver [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Instance spawned successfully.
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.046 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.058 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.064 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.076 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.077 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.077 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.078 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.079 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.080 187643 DEBUG nova.virt.libvirt.driver [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.086 187643 DEBUG nova.network.neutron [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Updated VIF entry in instance network info cache for port b755f0b1-c231-4652-bc2c-71033179f14c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.086 187643 DEBUG nova.network.neutron [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Updating instance_info_cache with network_info: [{"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.090 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.090 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844901.0329487, 5282a8f9-2db6-47c7-bbcb-061973c8b999 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.090 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] VM Paused (Lifecycle Event)
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.122 187643 DEBUG oslo_concurrency.lockutils [req-8451ba71-d1af-4526-a88c-705e3c49c85f req-de13d574-8d93-4696-9f61-7d08bfadd71e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.144 187643 INFO nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Took 4.39 seconds to spawn the instance on the hypervisor.
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.145 187643 DEBUG nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.154 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.160 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844901.0413578, 5282a8f9-2db6-47c7-bbcb-061973c8b999 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.161 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] VM Resumed (Lifecycle Event)
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.189 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.193 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.207 187643 INFO nova.compute.manager [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Took 4.88 seconds to build instance.
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.227 187643 DEBUG oslo_concurrency.lockutils [None req-2c1d870e-4370-4907-b9ce-c4fbde07b272 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.227 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.228 187643 INFO nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.228 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:21 compute-0 sshd-session[212913]: Invalid user admin from 143.198.30.3 port 35156
Feb 23 11:08:21 compute-0 sshd-session[212913]: Connection closed by invalid user admin 143.198.30.3 port 35156 [preauth]
Feb 23 11:08:21 compute-0 podman[212929]: 2026-02-23 11:08:21.313572628 +0000 UTC m=+0.055122180 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 11:08:21 compute-0 podman[212940]: 2026-02-23 11:08:21.337306027 +0000 UTC m=+0.060397861 container create 6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:08:21 compute-0 systemd[1]: Started libpod-conmon-6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb.scope.
Feb 23 11:08:21 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:08:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55d4410028cc55e74a3b93ae8ec5ea7312bff0b5457f3f5247765794cf0e2018/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:08:21 compute-0 podman[212940]: 2026-02-23 11:08:21.306245664 +0000 UTC m=+0.029337508 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:08:21 compute-0 podman[212940]: 2026-02-23 11:08:21.407396373 +0000 UTC m=+0.130488237 container init 6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 11:08:21 compute-0 podman[212940]: 2026-02-23 11:08:21.412698713 +0000 UTC m=+0.135790527 container start 6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 11:08:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [NOTICE]   (212973) : New worker (212975) forked
Feb 23 11:08:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [NOTICE]   (212973) : Loading success.
Feb 23 11:08:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:08:21.716 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:08:21 compute-0 nova_compute[187639]: 2026-02-23 11:08:21.979 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:22 compute-0 nova_compute[187639]: 2026-02-23 11:08:22.993 187643 DEBUG nova.compute.manager [req-6f72993c-a912-4dc6-bb07-d072812ab763 req-7bfb24a2-5b35-45f1-af41-6f4f73d7d4c4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:08:22 compute-0 nova_compute[187639]: 2026-02-23 11:08:22.994 187643 DEBUG oslo_concurrency.lockutils [req-6f72993c-a912-4dc6-bb07-d072812ab763 req-7bfb24a2-5b35-45f1-af41-6f4f73d7d4c4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:08:22 compute-0 nova_compute[187639]: 2026-02-23 11:08:22.994 187643 DEBUG oslo_concurrency.lockutils [req-6f72993c-a912-4dc6-bb07-d072812ab763 req-7bfb24a2-5b35-45f1-af41-6f4f73d7d4c4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:08:22 compute-0 nova_compute[187639]: 2026-02-23 11:08:22.994 187643 DEBUG oslo_concurrency.lockutils [req-6f72993c-a912-4dc6-bb07-d072812ab763 req-7bfb24a2-5b35-45f1-af41-6f4f73d7d4c4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:08:22 compute-0 nova_compute[187639]: 2026-02-23 11:08:22.995 187643 DEBUG nova.compute.manager [req-6f72993c-a912-4dc6-bb07-d072812ab763 req-7bfb24a2-5b35-45f1-af41-6f4f73d7d4c4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] No waiting events found dispatching network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:08:22 compute-0 nova_compute[187639]: 2026-02-23 11:08:22.995 187643 WARNING nova.compute.manager [req-6f72993c-a912-4dc6-bb07-d072812ab763 req-7bfb24a2-5b35-45f1-af41-6f4f73d7d4c4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received unexpected event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c for instance with vm_state active and task_state None.
Feb 23 11:08:24 compute-0 nova_compute[187639]: 2026-02-23 11:08:24.958 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:26 compute-0 podman[212984]: 2026-02-23 11:08:26.856229145 +0000 UTC m=+0.063698818 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 11:08:27 compute-0 nova_compute[187639]: 2026-02-23 11:08:27.025 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:29 compute-0 podman[197002]: time="2026-02-23T11:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:08:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:08:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 23 11:08:29 compute-0 nova_compute[187639]: 2026-02-23 11:08:29.960 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:31 compute-0 openstack_network_exporter[199919]: ERROR   11:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:08:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:08:31 compute-0 openstack_network_exporter[199919]: ERROR   11:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:08:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:08:31 compute-0 podman[213011]: 2026-02-23 11:08:31.486418999 +0000 UTC m=+0.044828258 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 11:08:32 compute-0 nova_compute[187639]: 2026-02-23 11:08:32.065 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:33 compute-0 ovn_controller[97601]: 2026-02-23T11:08:33Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:41:2a 10.100.0.14
Feb 23 11:08:33 compute-0 ovn_controller[97601]: 2026-02-23T11:08:33Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:41:2a 10.100.0.14
Feb 23 11:08:34 compute-0 nova_compute[187639]: 2026-02-23 11:08:34.962 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:37 compute-0 nova_compute[187639]: 2026-02-23 11:08:37.158 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:39 compute-0 nova_compute[187639]: 2026-02-23 11:08:39.964 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:42 compute-0 nova_compute[187639]: 2026-02-23 11:08:42.162 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:44 compute-0 podman[213048]: 2026-02-23 11:08:44.834358401 +0000 UTC m=+0.042137397 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 11:08:44 compute-0 nova_compute[187639]: 2026-02-23 11:08:44.964 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:47 compute-0 nova_compute[187639]: 2026-02-23 11:08:47.194 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:49 compute-0 nova_compute[187639]: 2026-02-23 11:08:49.966 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:51 compute-0 sshd-session[213072]: Invalid user admin from 143.198.30.3 port 36868
Feb 23 11:08:51 compute-0 sshd-session[213072]: Connection closed by invalid user admin 143.198.30.3 port 36868 [preauth]
Feb 23 11:08:51 compute-0 podman[213074]: 2026-02-23 11:08:51.652831484 +0000 UTC m=+0.071179836 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 23 11:08:52 compute-0 nova_compute[187639]: 2026-02-23 11:08:52.238 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:54 compute-0 sshd-session[213093]: Connection closed by authenticating user root 165.227.79.48 port 34760 [preauth]
Feb 23 11:08:54 compute-0 nova_compute[187639]: 2026-02-23 11:08:54.969 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:57 compute-0 nova_compute[187639]: 2026-02-23 11:08:57.283 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:08:57 compute-0 ovn_controller[97601]: 2026-02-23T11:08:57Z|00123|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Feb 23 11:08:57 compute-0 podman[213096]: 2026-02-23 11:08:57.893392323 +0000 UTC m=+0.089414369 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 11:08:59 compute-0 nova_compute[187639]: 2026-02-23 11:08:59.712 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:08:59 compute-0 podman[197002]: time="2026-02-23T11:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:08:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:08:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2630 "" "Go-http-client/1.1"
Feb 23 11:08:59 compute-0 nova_compute[187639]: 2026-02-23 11:08:59.970 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:01 compute-0 openstack_network_exporter[199919]: ERROR   11:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:09:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:09:01 compute-0 openstack_network_exporter[199919]: ERROR   11:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:09:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:09:01 compute-0 nova_compute[187639]: 2026-02-23 11:09:01.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:01 compute-0 podman[213123]: 2026-02-23 11:09:01.863960839 +0000 UTC m=+0.065793764 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 23 11:09:02 compute-0 nova_compute[187639]: 2026-02-23 11:09:02.353 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:02 compute-0 nova_compute[187639]: 2026-02-23 11:09:02.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:02 compute-0 nova_compute[187639]: 2026-02-23 11:09:02.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:09:02 compute-0 nova_compute[187639]: 2026-02-23 11:09:02.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:09:03 compute-0 nova_compute[187639]: 2026-02-23 11:09:03.214 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:09:03 compute-0 nova_compute[187639]: 2026-02-23 11:09:03.215 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:09:03 compute-0 nova_compute[187639]: 2026-02-23 11:09:03.215 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:09:03 compute-0 nova_compute[187639]: 2026-02-23 11:09:03.215 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5282a8f9-2db6-47c7-bbcb-061973c8b999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.373 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Updating instance_info_cache with network_info: [{"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.398 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-5282a8f9-2db6-47c7-bbcb-061973c8b999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.398 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.398 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.399 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.399 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.399 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.717 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.719 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.783 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.838 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.839 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.912 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:04 compute-0 nova_compute[187639]: 2026-02-23 11:09:04.972 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.052 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.055 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5628MB free_disk=73.17645263671875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.055 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.056 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.239 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance 5282a8f9-2db6-47c7-bbcb-061973c8b999 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.240 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.240 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.301 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.318 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.352 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:09:05 compute-0 nova_compute[187639]: 2026-02-23 11:09:05.353 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:06 compute-0 nova_compute[187639]: 2026-02-23 11:09:06.349 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:07 compute-0 nova_compute[187639]: 2026-02-23 11:09:07.354 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:09 compute-0 nova_compute[187639]: 2026-02-23 11:09:09.975 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:10 compute-0 nova_compute[187639]: 2026-02-23 11:09:10.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:12 compute-0 nova_compute[187639]: 2026-02-23 11:09:12.396 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:12.653 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:12.654 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:12.654 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:14 compute-0 nova_compute[187639]: 2026-02-23 11:09:14.976 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:15 compute-0 nova_compute[187639]: 2026-02-23 11:09:15.296 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Creating tmpfile /var/lib/nova/instances/tmpurzcgwrj to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 23 11:09:15 compute-0 nova_compute[187639]: 2026-02-23 11:09:15.411 187643 DEBUG nova.compute.manager [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurzcgwrj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 23 11:09:15 compute-0 podman[213151]: 2026-02-23 11:09:15.865975152 +0000 UTC m=+0.053866447 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:09:16 compute-0 nova_compute[187639]: 2026-02-23 11:09:16.431 187643 DEBUG nova.compute.manager [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurzcgwrj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='76696b0e-6027-4a27-8100-8528dd9c1fd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 23 11:09:16 compute-0 nova_compute[187639]: 2026-02-23 11:09:16.460 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-76696b0e-6027-4a27-8100-8528dd9c1fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:09:16 compute-0 nova_compute[187639]: 2026-02-23 11:09:16.461 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-76696b0e-6027-4a27-8100-8528dd9c1fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:09:16 compute-0 nova_compute[187639]: 2026-02-23 11:09:16.462 187643 DEBUG nova.network.neutron [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:09:17 compute-0 nova_compute[187639]: 2026-02-23 11:09:17.397 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.641 187643 DEBUG nova.network.neutron [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Updating instance_info_cache with network_info: [{"id": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "address": "fa:16:3e:ac:da:02", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69498eaa-e7", "ovs_interfaceid": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.659 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-76696b0e-6027-4a27-8100-8528dd9c1fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.662 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurzcgwrj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='76696b0e-6027-4a27-8100-8528dd9c1fd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.663 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Creating instance directory: /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.664 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Creating disk.info with the contents: {'/var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk': 'qcow2', '/var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.665 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.665 187643 DEBUG nova.objects.instance [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 76696b0e-6027-4a27-8100-8528dd9c1fd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.706 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.770 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.771 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.772 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.787 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.875 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.876 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.915 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.916 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.917 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.974 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.975 187643 DEBUG nova.virt.disk.api [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Checking if we can resize image /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:09:18 compute-0 nova_compute[187639]: 2026-02-23 11:09:18.976 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.023 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.023 187643 DEBUG nova.virt.disk.api [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Cannot resize image /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.024 187643 DEBUG nova.objects.instance [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid 76696b0e-6027-4a27-8100-8528dd9c1fd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.040 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.055 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk.config 485376" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.056 187643 DEBUG nova.virt.libvirt.volume.remotefs [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk.config to /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.056 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk.config /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.496 187643 DEBUG oslo_concurrency.processutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7/disk.config /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.498 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.499 187643 DEBUG nova.virt.libvirt.vif [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1212057328',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1212057328',id=16,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-iw03pwz6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:08:31Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=76696b0e-6027-4a27-8100-8528dd9c1fd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "address": "fa:16:3e:ac:da:02", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap69498eaa-e7", "ovs_interfaceid": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.499 187643 DEBUG nova.network.os_vif_util [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "address": "fa:16:3e:ac:da:02", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap69498eaa-e7", "ovs_interfaceid": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.500 187643 DEBUG nova.network.os_vif_util [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:da:02,bridge_name='br-int',has_traffic_filtering=True,id=69498eaa-e796-4cb8-9bcf-36a86a97cd89,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69498eaa-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.501 187643 DEBUG os_vif [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:da:02,bridge_name='br-int',has_traffic_filtering=True,id=69498eaa-e796-4cb8-9bcf-36a86a97cd89,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69498eaa-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.502 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.503 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.503 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.506 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.506 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69498eaa-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.506 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69498eaa-e7, col_values=(('external_ids', {'iface-id': '69498eaa-e796-4cb8-9bcf-36a86a97cd89', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:da:02', 'vm-uuid': '76696b0e-6027-4a27-8100-8528dd9c1fd7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.508 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:19 compute-0 NetworkManager[57207]: <info>  [1771844959.5090] manager: (tap69498eaa-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.511 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.513 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.514 187643 INFO os_vif [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:da:02,bridge_name='br-int',has_traffic_filtering=True,id=69498eaa-e796-4cb8-9bcf-36a86a97cd89,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69498eaa-e7')
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.515 187643 DEBUG nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 23 11:09:19 compute-0 nova_compute[187639]: 2026-02-23 11:09:19.515 187643 DEBUG nova.compute.manager [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurzcgwrj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='76696b0e-6027-4a27-8100-8528dd9c1fd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 23 11:09:20 compute-0 nova_compute[187639]: 2026-02-23 11:09:20.551 187643 DEBUG nova.network.neutron [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Port 69498eaa-e796-4cb8-9bcf-36a86a97cd89 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 23 11:09:20 compute-0 nova_compute[187639]: 2026-02-23 11:09:20.553 187643 DEBUG nova.compute.manager [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurzcgwrj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='76696b0e-6027-4a27-8100-8528dd9c1fd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 23 11:09:20 compute-0 kernel: tap69498eaa-e7: entered promiscuous mode
Feb 23 11:09:20 compute-0 ovn_controller[97601]: 2026-02-23T11:09:20Z|00124|binding|INFO|Claiming lport 69498eaa-e796-4cb8-9bcf-36a86a97cd89 for this additional chassis.
Feb 23 11:09:20 compute-0 ovn_controller[97601]: 2026-02-23T11:09:20Z|00125|binding|INFO|69498eaa-e796-4cb8-9bcf-36a86a97cd89: Claiming fa:16:3e:ac:da:02 10.100.0.9
Feb 23 11:09:20 compute-0 NetworkManager[57207]: <info>  [1771844960.8200] manager: (tap69498eaa-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Feb 23 11:09:20 compute-0 nova_compute[187639]: 2026-02-23 11:09:20.820 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:20 compute-0 ovn_controller[97601]: 2026-02-23T11:09:20Z|00126|binding|INFO|Setting lport 69498eaa-e796-4cb8-9bcf-36a86a97cd89 ovn-installed in OVS
Feb 23 11:09:20 compute-0 nova_compute[187639]: 2026-02-23 11:09:20.825 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:20 compute-0 nova_compute[187639]: 2026-02-23 11:09:20.830 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:20 compute-0 systemd-machined[156970]: New machine qemu-12-instance-00000010.
Feb 23 11:09:20 compute-0 systemd-udevd[213214]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:09:20 compute-0 NetworkManager[57207]: <info>  [1771844960.8707] device (tap69498eaa-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:09:20 compute-0 NetworkManager[57207]: <info>  [1771844960.8721] device (tap69498eaa-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:09:20 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000010.
Feb 23 11:09:21 compute-0 podman[213230]: 2026-02-23 11:09:21.876466629 +0000 UTC m=+0.068219047 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:09:21 compute-0 nova_compute[187639]: 2026-02-23 11:09:21.885 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844961.8849604, 76696b0e-6027-4a27-8100-8528dd9c1fd7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:09:21 compute-0 nova_compute[187639]: 2026-02-23 11:09:21.886 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] VM Started (Lifecycle Event)
Feb 23 11:09:21 compute-0 nova_compute[187639]: 2026-02-23 11:09:21.913 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:09:22 compute-0 nova_compute[187639]: 2026-02-23 11:09:22.431 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:22 compute-0 nova_compute[187639]: 2026-02-23 11:09:22.567 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771844962.5674412, 76696b0e-6027-4a27-8100-8528dd9c1fd7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:09:22 compute-0 nova_compute[187639]: 2026-02-23 11:09:22.568 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] VM Resumed (Lifecycle Event)
Feb 23 11:09:22 compute-0 nova_compute[187639]: 2026-02-23 11:09:22.588 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:09:22 compute-0 nova_compute[187639]: 2026-02-23 11:09:22.591 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:09:22 compute-0 nova_compute[187639]: 2026-02-23 11:09:22.621 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.448 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:09:23 compute-0 nova_compute[187639]: 2026-02-23 11:09:23.449 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.450 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:09:23 compute-0 ovn_controller[97601]: 2026-02-23T11:09:23Z|00127|binding|INFO|Claiming lport 69498eaa-e796-4cb8-9bcf-36a86a97cd89 for this chassis.
Feb 23 11:09:23 compute-0 ovn_controller[97601]: 2026-02-23T11:09:23Z|00128|binding|INFO|69498eaa-e796-4cb8-9bcf-36a86a97cd89: Claiming fa:16:3e:ac:da:02 10.100.0.9
Feb 23 11:09:23 compute-0 ovn_controller[97601]: 2026-02-23T11:09:23Z|00129|binding|INFO|Setting lport 69498eaa-e796-4cb8-9bcf-36a86a97cd89 up in Southbound
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.784 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:da:02 10.100.0.9'], port_security=['fa:16:3e:ac:da:02 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '76696b0e-6027-4a27-8100-8528dd9c1fd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '11', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=69498eaa-e796-4cb8-9bcf-36a86a97cd89) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.786 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 69498eaa-e796-4cb8-9bcf-36a86a97cd89 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.790 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.806 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[734fd9fc-1d14-45a8-9bc7-0cecc742c347]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.829 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5cb629-5c3a-4ef2-973b-a24e9b05f95a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.832 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b414bfd9-70dd-4af0-9ada-9aafc7e55762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.855 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[de5bcef7-910a-4fec-a0ec-6d35c0c3a9cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.871 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f091f6e3-edb5-4659-8d6f-bc6d7652e402]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404498, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213262, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.880 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a52098-85a1-4d32-8984-1e62bd281665]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404509, 'tstamp': 404509}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213263, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404511, 'tstamp': 404511}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213263, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.882 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:23 compute-0 nova_compute[187639]: 2026-02-23 11:09:23.883 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:23 compute-0 nova_compute[187639]: 2026-02-23 11:09:23.884 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.885 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.885 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.885 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:23.886 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:09:23 compute-0 nova_compute[187639]: 2026-02-23 11:09:23.987 187643 INFO nova.compute.manager [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Post operation of migration started
Feb 23 11:09:24 compute-0 nova_compute[187639]: 2026-02-23 11:09:24.334 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-76696b0e-6027-4a27-8100-8528dd9c1fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:09:24 compute-0 nova_compute[187639]: 2026-02-23 11:09:24.334 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-76696b0e-6027-4a27-8100-8528dd9c1fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:09:24 compute-0 nova_compute[187639]: 2026-02-23 11:09:24.335 187643 DEBUG nova.network.neutron [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:09:24 compute-0 nova_compute[187639]: 2026-02-23 11:09:24.538 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:24 compute-0 sshd-session[213264]: Invalid user admin from 143.198.30.3 port 54156
Feb 23 11:09:24 compute-0 sshd-session[213264]: Connection closed by invalid user admin 143.198.30.3 port 54156 [preauth]
Feb 23 11:09:25 compute-0 nova_compute[187639]: 2026-02-23 11:09:25.953 187643 DEBUG nova.network.neutron [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Updating instance_info_cache with network_info: [{"id": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "address": "fa:16:3e:ac:da:02", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69498eaa-e7", "ovs_interfaceid": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:09:25 compute-0 nova_compute[187639]: 2026-02-23 11:09:25.970 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-76696b0e-6027-4a27-8100-8528dd9c1fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:09:25 compute-0 nova_compute[187639]: 2026-02-23 11:09:25.991 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:25 compute-0 nova_compute[187639]: 2026-02-23 11:09:25.991 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:25 compute-0 nova_compute[187639]: 2026-02-23 11:09:25.992 187643 DEBUG oslo_concurrency.lockutils [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:25 compute-0 nova_compute[187639]: 2026-02-23 11:09:25.996 187643 INFO nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 23 11:09:25 compute-0 virtqemud[186733]: Domain id=12 name='instance-00000010' uuid=76696b0e-6027-4a27-8100-8528dd9c1fd7 is tainted: custom-monitor
Feb 23 11:09:27 compute-0 nova_compute[187639]: 2026-02-23 11:09:27.002 187643 INFO nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 23 11:09:27 compute-0 nova_compute[187639]: 2026-02-23 11:09:27.434 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:28 compute-0 nova_compute[187639]: 2026-02-23 11:09:28.007 187643 INFO nova.virt.libvirt.driver [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 23 11:09:28 compute-0 nova_compute[187639]: 2026-02-23 11:09:28.011 187643 DEBUG nova.compute.manager [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:09:28 compute-0 nova_compute[187639]: 2026-02-23 11:09:28.029 187643 DEBUG nova.objects.instance [None req-7d7efec5-6354-4aa9-b07c-453fc985aaba a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 23 11:09:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:28.452 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:28 compute-0 podman[213266]: 2026-02-23 11:09:28.908864586 +0000 UTC m=+0.109988153 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 11:09:29 compute-0 nova_compute[187639]: 2026-02-23 11:09:29.540 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:29 compute-0 podman[197002]: time="2026-02-23T11:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:09:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:09:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 23 11:09:31 compute-0 openstack_network_exporter[199919]: ERROR   11:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:09:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:09:31 compute-0 openstack_network_exporter[199919]: ERROR   11:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:09:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:09:32 compute-0 nova_compute[187639]: 2026-02-23 11:09:32.459 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:32 compute-0 podman[213293]: 2026-02-23 11:09:32.883463839 +0000 UTC m=+0.080713948 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 23 11:09:34 compute-0 nova_compute[187639]: 2026-02-23 11:09:34.542 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.021 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "76696b0e-6027-4a27-8100-8528dd9c1fd7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.022 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.022 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.022 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.022 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.023 187643 INFO nova.compute.manager [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Terminating instance
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.024 187643 DEBUG nova.compute.manager [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:09:35 compute-0 kernel: tap69498eaa-e7 (unregistering): left promiscuous mode
Feb 23 11:09:35 compute-0 ovn_controller[97601]: 2026-02-23T11:09:35Z|00130|binding|INFO|Releasing lport 69498eaa-e796-4cb8-9bcf-36a86a97cd89 from this chassis (sb_readonly=0)
Feb 23 11:09:35 compute-0 ovn_controller[97601]: 2026-02-23T11:09:35Z|00131|binding|INFO|Setting lport 69498eaa-e796-4cb8-9bcf-36a86a97cd89 down in Southbound
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.060 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 ovn_controller[97601]: 2026-02-23T11:09:35Z|00132|binding|INFO|Removing iface tap69498eaa-e7 ovn-installed in OVS
Feb 23 11:09:35 compute-0 NetworkManager[57207]: <info>  [1771844975.0615] device (tap69498eaa-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.062 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.067 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.067 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:da:02 10.100.0.9'], port_security=['fa:16:3e:ac:da:02 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '76696b0e-6027-4a27-8100-8528dd9c1fd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '13', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=69498eaa-e796-4cb8-9bcf-36a86a97cd89) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.068 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 69498eaa-e796-4cb8-9bcf-36a86a97cd89 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.069 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.082 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1d358c-19af-4e39-a105-ed3a55fdc5eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.103 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4b9b01-82a0-46e9-941d-b11345d6c6e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.106 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[99c4fa40-874d-427d-94c3-b2fc83ef831d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:35 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Deactivated successfully.
Feb 23 11:09:35 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Consumed 1.818s CPU time.
Feb 23 11:09:35 compute-0 systemd-machined[156970]: Machine qemu-12-instance-00000010 terminated.
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.125 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5cf1ce-9349-4ecc-976d-130399f5c6ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.139 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1727428e-22f4-477c-8010-772f2a43f6e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404498, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213343, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.151 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7e5d2a-3a44-471b-ae80-211f80963015]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404509, 'tstamp': 404509}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213344, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404511, 'tstamp': 404511}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213344, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.152 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.153 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.156 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.157 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.157 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.157 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:35 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:35.158 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.276 187643 INFO nova.virt.libvirt.driver [-] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Instance destroyed successfully.
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.277 187643 DEBUG nova.objects.instance [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid 76696b0e-6027-4a27-8100-8528dd9c1fd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.298 187643 DEBUG nova.virt.libvirt.vif [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-23T11:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1212057328',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1212057328',id=16,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:08:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-iw03pwz6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:09:28Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=76696b0e-6027-4a27-8100-8528dd9c1fd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "address": "fa:16:3e:ac:da:02", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69498eaa-e7", "ovs_interfaceid": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.299 187643 DEBUG nova.network.os_vif_util [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "address": "fa:16:3e:ac:da:02", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69498eaa-e7", "ovs_interfaceid": "69498eaa-e796-4cb8-9bcf-36a86a97cd89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.299 187643 DEBUG nova.network.os_vif_util [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:da:02,bridge_name='br-int',has_traffic_filtering=True,id=69498eaa-e796-4cb8-9bcf-36a86a97cd89,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69498eaa-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.300 187643 DEBUG os_vif [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:da:02,bridge_name='br-int',has_traffic_filtering=True,id=69498eaa-e796-4cb8-9bcf-36a86a97cd89,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69498eaa-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.301 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.302 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69498eaa-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.303 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.306 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.307 187643 INFO os_vif [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:da:02,bridge_name='br-int',has_traffic_filtering=True,id=69498eaa-e796-4cb8-9bcf-36a86a97cd89,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69498eaa-e7')
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.308 187643 INFO nova.virt.libvirt.driver [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Deleting instance files /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7_del
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.309 187643 INFO nova.virt.libvirt.driver [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Deletion of /var/lib/nova/instances/76696b0e-6027-4a27-8100-8528dd9c1fd7_del complete
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.382 187643 INFO nova.compute.manager [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.382 187643 DEBUG oslo.service.loopingcall [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.383 187643 DEBUG nova.compute.manager [-] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.383 187643 DEBUG nova.network.neutron [-] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.434 187643 DEBUG nova.compute.manager [req-64d4fa42-1b24-4366-80b7-e697265315c4 req-f6347ae5-e1cc-4b8e-bdbc-0af3fac7db76 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Received event network-vif-unplugged-69498eaa-e796-4cb8-9bcf-36a86a97cd89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.434 187643 DEBUG oslo_concurrency.lockutils [req-64d4fa42-1b24-4366-80b7-e697265315c4 req-f6347ae5-e1cc-4b8e-bdbc-0af3fac7db76 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.434 187643 DEBUG oslo_concurrency.lockutils [req-64d4fa42-1b24-4366-80b7-e697265315c4 req-f6347ae5-e1cc-4b8e-bdbc-0af3fac7db76 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.435 187643 DEBUG oslo_concurrency.lockutils [req-64d4fa42-1b24-4366-80b7-e697265315c4 req-f6347ae5-e1cc-4b8e-bdbc-0af3fac7db76 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.435 187643 DEBUG nova.compute.manager [req-64d4fa42-1b24-4366-80b7-e697265315c4 req-f6347ae5-e1cc-4b8e-bdbc-0af3fac7db76 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] No waiting events found dispatching network-vif-unplugged-69498eaa-e796-4cb8-9bcf-36a86a97cd89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.435 187643 DEBUG nova.compute.manager [req-64d4fa42-1b24-4366-80b7-e697265315c4 req-f6347ae5-e1cc-4b8e-bdbc-0af3fac7db76 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Received event network-vif-unplugged-69498eaa-e796-4cb8-9bcf-36a86a97cd89 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.948 187643 DEBUG nova.network.neutron [-] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:09:35 compute-0 nova_compute[187639]: 2026-02-23 11:09:35.965 187643 INFO nova.compute.manager [-] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Took 0.58 seconds to deallocate network for instance.
Feb 23 11:09:36 compute-0 nova_compute[187639]: 2026-02-23 11:09:36.019 187643 DEBUG nova.compute.manager [req-4066818c-451c-474e-ad39-582089b56b69 req-55258809-7d50-4cef-a118-9da1cd6777a7 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Received event network-vif-deleted-69498eaa-e796-4cb8-9bcf-36a86a97cd89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:36 compute-0 nova_compute[187639]: 2026-02-23 11:09:36.020 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:36 compute-0 nova_compute[187639]: 2026-02-23 11:09:36.020 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:36 compute-0 nova_compute[187639]: 2026-02-23 11:09:36.025 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:36 compute-0 nova_compute[187639]: 2026-02-23 11:09:36.053 187643 INFO nova.scheduler.client.report [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance 76696b0e-6027-4a27-8100-8528dd9c1fd7
Feb 23 11:09:36 compute-0 nova_compute[187639]: 2026-02-23 11:09:36.098 187643 DEBUG oslo_concurrency.lockutils [None req-f904d7af-d5b6-4acf-8530-109cd36d14e4 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.014 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.014 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.015 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.015 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.015 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.017 187643 INFO nova.compute.manager [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Terminating instance
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.018 187643 DEBUG nova.compute.manager [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:09:37 compute-0 kernel: tapb755f0b1-c2 (unregistering): left promiscuous mode
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.049 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 NetworkManager[57207]: <info>  [1771844977.0503] device (tapb755f0b1-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.053 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00133|binding|INFO|Releasing lport b755f0b1-c231-4652-bc2c-71033179f14c from this chassis (sb_readonly=0)
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00134|binding|INFO|Setting lport b755f0b1-c231-4652-bc2c-71033179f14c down in Southbound
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00135|binding|INFO|Removing iface tapb755f0b1-c2 ovn-installed in OVS
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.055 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.057 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.062 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:41:2a 10.100.0.14'], port_security=['fa:16:3e:0e:41:2a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5282a8f9-2db6-47c7-bbcb-061973c8b999', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b755f0b1-c231-4652-bc2c-71033179f14c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.064 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b755f0b1-c231-4652-bc2c-71033179f14c in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.066 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.067 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5f278cc6-c776-4fa9-b8d3-6c4b8b46bcbf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.067 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:09:37 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Feb 23 11:09:37 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 13.964s CPU time.
Feb 23 11:09:37 compute-0 systemd-machined[156970]: Machine qemu-11-instance-0000000f terminated.
Feb 23 11:09:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [NOTICE]   (212973) : haproxy version is 2.8.14-c23fe91
Feb 23 11:09:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [NOTICE]   (212973) : path to executable is /usr/sbin/haproxy
Feb 23 11:09:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [WARNING]  (212973) : Exiting Master process...
Feb 23 11:09:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [ALERT]    (212973) : Current worker (212975) exited with code 143 (Terminated)
Feb 23 11:09:37 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[212969]: [WARNING]  (212973) : All workers exited. Exiting... (0)
Feb 23 11:09:37 compute-0 systemd[1]: libpod-6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb.scope: Deactivated successfully.
Feb 23 11:09:37 compute-0 conmon[212969]: conmon 6576c906780b3eaf0a93 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb.scope/container/memory.events
Feb 23 11:09:37 compute-0 podman[213384]: 2026-02-23 11:09:37.19371095 +0000 UTC m=+0.041515321 container died 6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 11:09:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb-userdata-shm.mount: Deactivated successfully.
Feb 23 11:09:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-55d4410028cc55e74a3b93ae8ec5ea7312bff0b5457f3f5247765794cf0e2018-merged.mount: Deactivated successfully.
Feb 23 11:09:37 compute-0 podman[213384]: 2026-02-23 11:09:37.220247252 +0000 UTC m=+0.068051583 container cleanup 6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 11:09:37 compute-0 systemd[1]: libpod-conmon-6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb.scope: Deactivated successfully.
Feb 23 11:09:37 compute-0 kernel: tapb755f0b1-c2: entered promiscuous mode
Feb 23 11:09:37 compute-0 NetworkManager[57207]: <info>  [1771844977.2355] manager: (tapb755f0b1-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Feb 23 11:09:37 compute-0 kernel: tapb755f0b1-c2 (unregistering): left promiscuous mode
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.242 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00136|binding|INFO|Claiming lport b755f0b1-c231-4652-bc2c-71033179f14c for this chassis.
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00137|binding|INFO|b755f0b1-c231-4652-bc2c-71033179f14c: Claiming fa:16:3e:0e:41:2a 10.100.0.14
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.252 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:41:2a 10.100.0.14'], port_security=['fa:16:3e:0e:41:2a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5282a8f9-2db6-47c7-bbcb-061973c8b999', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b755f0b1-c231-4652-bc2c-71033179f14c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.255 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00138|binding|INFO|Setting lport b755f0b1-c231-4652-bc2c-71033179f14c ovn-installed in OVS
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00139|binding|INFO|Setting lport b755f0b1-c231-4652-bc2c-71033179f14c up in Southbound
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.257 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.275 187643 INFO nova.virt.libvirt.driver [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Instance destroyed successfully.
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.276 187643 DEBUG nova.objects.instance [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid 5282a8f9-2db6-47c7-bbcb-061973c8b999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:09:37 compute-0 podman[213414]: 2026-02-23 11:09:37.295943867 +0000 UTC m=+0.060357879 container remove 6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.305 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[589f59fe-f4a1-476a-80df-c1a58b8bc4d1]: (4, ('Mon Feb 23 11:09:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb)\n6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb\nMon Feb 23 11:09:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb)\n6576c906780b3eaf0a9395b28cbc0f55f7a7e2456ad814004a10b95692de05eb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.306 187643 DEBUG nova.virt.libvirt.vif [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:08:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-935332689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-935332689',id=15,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:08:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-19c7ppbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name=
'tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:08:21Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=5282a8f9-2db6-47c7-bbcb-061973c8b999,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.306 187643 DEBUG nova.network.os_vif_util [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "b755f0b1-c231-4652-bc2c-71033179f14c", "address": "fa:16:3e:0e:41:2a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb755f0b1-c2", "ovs_interfaceid": "b755f0b1-c231-4652-bc2c-71033179f14c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.307 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc82a16-f364-495b-98af-218905ceb475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.307 187643 DEBUG nova.network.os_vif_util [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.308 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.308 187643 DEBUG os_vif [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:09:37 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.310 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.311 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb755f0b1-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00140|binding|INFO|Releasing lport b755f0b1-c231-4652-bc2c-71033179f14c from this chassis (sb_readonly=0)
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.312 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 ovn_controller[97601]: 2026-02-23T11:09:37Z|00141|binding|INFO|Setting lport b755f0b1-c231-4652-bc2c-71033179f14c down in Southbound
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.314 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.318 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:41:2a 10.100.0.14'], port_security=['fa:16:3e:0e:41:2a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5282a8f9-2db6-47c7-bbcb-061973c8b999', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b755f0b1-c231-4652-bc2c-71033179f14c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.319 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.324 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.325 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.326 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.327 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[747dcd24-c711-417b-9740-0473f225ef81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.329 187643 INFO os_vif [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:41:2a,bridge_name='br-int',has_traffic_filtering=True,id=b755f0b1-c231-4652-bc2c-71033179f14c,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb755f0b1-c2')
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.329 187643 INFO nova.virt.libvirt.driver [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Deleting instance files /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999_del
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.330 187643 INFO nova.virt.libvirt.driver [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Deletion of /var/lib/nova/instances/5282a8f9-2db6-47c7-bbcb-061973c8b999_del complete
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.342 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[09945f58-d16e-48fb-b90b-bbd35848ac74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.343 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7de6de25-934e-4408-8904-4032bea35e31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.356 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8312c0ef-b8b1-42f4-9cee-f700c51534d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404491, 'reachable_time': 38008, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213446, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.358 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.358 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[3d0d0eaf-d9d5-467b-a657-44081f40d5aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.360 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b755f0b1-c231-4652-bc2c-71033179f14c in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.361 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.362 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff91957-4429-4047-91a8-17907909820a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.362 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b755f0b1-c231-4652-bc2c-71033179f14c in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.364 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:09:37 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:09:37.364 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a3909a04-ce2e-48a3-abe5-d78049991977]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.388 187643 INFO nova.compute.manager [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.389 187643 DEBUG oslo.service.loopingcall [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.389 187643 DEBUG nova.compute.manager [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.390 187643 DEBUG nova.network.neutron [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.461 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.547 187643 DEBUG nova.compute.manager [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Received event network-vif-plugged-69498eaa-e796-4cb8-9bcf-36a86a97cd89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.547 187643 DEBUG oslo_concurrency.lockutils [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.548 187643 DEBUG oslo_concurrency.lockutils [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.548 187643 DEBUG oslo_concurrency.lockutils [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "76696b0e-6027-4a27-8100-8528dd9c1fd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.549 187643 DEBUG nova.compute.manager [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] No waiting events found dispatching network-vif-plugged-69498eaa-e796-4cb8-9bcf-36a86a97cd89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.549 187643 WARNING nova.compute.manager [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Received unexpected event network-vif-plugged-69498eaa-e796-4cb8-9bcf-36a86a97cd89 for instance with vm_state deleted and task_state None.
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.549 187643 DEBUG nova.compute.manager [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-unplugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.549 187643 DEBUG oslo_concurrency.lockutils [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.550 187643 DEBUG oslo_concurrency.lockutils [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.550 187643 DEBUG oslo_concurrency.lockutils [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.550 187643 DEBUG nova.compute.manager [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] No waiting events found dispatching network-vif-unplugged-b755f0b1-c231-4652-bc2c-71033179f14c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.550 187643 DEBUG nova.compute.manager [req-75476d0e-0b3f-4250-84e0-8a277f5cb8a9 req-6300d954-bdb9-407f-a9d4-df4fd21af09f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-unplugged-b755f0b1-c231-4652-bc2c-71033179f14c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:09:37 compute-0 sshd-session[213447]: Connection closed by authenticating user root 165.227.79.48 port 54472 [preauth]
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.857 187643 DEBUG nova.network.neutron [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.886 187643 INFO nova.compute.manager [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Took 0.50 seconds to deallocate network for instance.
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.940 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:37 compute-0 nova_compute[187639]: 2026-02-23 11:09:37.940 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:38 compute-0 nova_compute[187639]: 2026-02-23 11:09:38.003 187643 DEBUG nova.compute.provider_tree [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:09:38 compute-0 nova_compute[187639]: 2026-02-23 11:09:38.023 187643 DEBUG nova.scheduler.client.report [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:09:38 compute-0 nova_compute[187639]: 2026-02-23 11:09:38.045 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:38 compute-0 nova_compute[187639]: 2026-02-23 11:09:38.071 187643 INFO nova.scheduler.client.report [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance 5282a8f9-2db6-47c7-bbcb-061973c8b999
Feb 23 11:09:38 compute-0 nova_compute[187639]: 2026-02-23 11:09:38.124 187643 DEBUG nova.compute.manager [req-3d3514f9-3fe2-488c-993c-b9a62e376c52 req-dd55428b-7653-4213-804b-7ad920bd074f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-deleted-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:38 compute-0 nova_compute[187639]: 2026-02-23 11:09:38.145 187643 DEBUG oslo_concurrency.lockutils [None req-94e3e792-7d10-44d7-b613-86faaaeaac98 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.682 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.683 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.684 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.684 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.684 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] No waiting events found dispatching network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.685 187643 WARNING nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received unexpected event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c for instance with vm_state deleted and task_state None.
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.685 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.685 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.686 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.686 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.686 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] No waiting events found dispatching network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.687 187643 WARNING nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received unexpected event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c for instance with vm_state deleted and task_state None.
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.687 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.687 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.688 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.688 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.688 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] No waiting events found dispatching network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.689 187643 WARNING nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received unexpected event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c for instance with vm_state deleted and task_state None.
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.689 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.689 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.690 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.690 187643 DEBUG oslo_concurrency.lockutils [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "5282a8f9-2db6-47c7-bbcb-061973c8b999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.690 187643 DEBUG nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] No waiting events found dispatching network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:09:39 compute-0 nova_compute[187639]: 2026-02-23 11:09:39.691 187643 WARNING nova.compute.manager [req-1aca9430-f688-46b2-a675-4edc153363a7 req-d0a9bc69-14c3-4da9-a5f7-0ef3f30cb95c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Received unexpected event network-vif-plugged-b755f0b1-c231-4652-bc2c-71033179f14c for instance with vm_state deleted and task_state None.
Feb 23 11:09:42 compute-0 nova_compute[187639]: 2026-02-23 11:09:42.346 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:42 compute-0 nova_compute[187639]: 2026-02-23 11:09:42.463 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:46 compute-0 podman[213450]: 2026-02-23 11:09:46.876811832 +0000 UTC m=+0.082803894 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:09:47 compute-0 nova_compute[187639]: 2026-02-23 11:09:47.385 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:47 compute-0 nova_compute[187639]: 2026-02-23 11:09:47.464 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:50 compute-0 nova_compute[187639]: 2026-02-23 11:09:50.275 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844975.2745147, 76696b0e-6027-4a27-8100-8528dd9c1fd7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:09:50 compute-0 nova_compute[187639]: 2026-02-23 11:09:50.275 187643 INFO nova.compute.manager [-] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] VM Stopped (Lifecycle Event)
Feb 23 11:09:50 compute-0 nova_compute[187639]: 2026-02-23 11:09:50.305 187643 DEBUG nova.compute.manager [None req-014ebc61-f017-43b4-b033-b5b54dd6c124 - - - - - -] [instance: 76696b0e-6027-4a27-8100-8528dd9c1fd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:09:52 compute-0 nova_compute[187639]: 2026-02-23 11:09:52.275 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771844977.2725525, 5282a8f9-2db6-47c7-bbcb-061973c8b999 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:09:52 compute-0 nova_compute[187639]: 2026-02-23 11:09:52.275 187643 INFO nova.compute.manager [-] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] VM Stopped (Lifecycle Event)
Feb 23 11:09:52 compute-0 nova_compute[187639]: 2026-02-23 11:09:52.308 187643 DEBUG nova.compute.manager [None req-efddc4d1-eec4-4c1c-8bc1-6b4419dbfa11 - - - - - -] [instance: 5282a8f9-2db6-47c7-bbcb-061973c8b999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:09:52 compute-0 nova_compute[187639]: 2026-02-23 11:09:52.421 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:52 compute-0 nova_compute[187639]: 2026-02-23 11:09:52.465 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:52 compute-0 podman[213475]: 2026-02-23 11:09:52.856312216 +0000 UTC m=+0.054711399 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:09:55 compute-0 sshd-session[213496]: Invalid user admin from 143.198.30.3 port 43942
Feb 23 11:09:55 compute-0 sshd-session[213496]: Connection closed by invalid user admin 143.198.30.3 port 43942 [preauth]
Feb 23 11:09:57 compute-0 nova_compute[187639]: 2026-02-23 11:09:57.468 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:09:57 compute-0 nova_compute[187639]: 2026-02-23 11:09:57.470 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:09:57 compute-0 nova_compute[187639]: 2026-02-23 11:09:57.470 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 11:09:57 compute-0 nova_compute[187639]: 2026-02-23 11:09:57.470 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:09:57 compute-0 nova_compute[187639]: 2026-02-23 11:09:57.473 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:09:57 compute-0 nova_compute[187639]: 2026-02-23 11:09:57.473 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:09:59 compute-0 nova_compute[187639]: 2026-02-23 11:09:59.694 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:09:59 compute-0 podman[197002]: time="2026-02-23T11:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:09:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:09:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 23 11:09:59 compute-0 podman[213499]: 2026-02-23 11:09:59.862546873 +0000 UTC m=+0.069474621 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 11:10:01 compute-0 openstack_network_exporter[199919]: ERROR   11:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:10:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:10:01 compute-0 openstack_network_exporter[199919]: ERROR   11:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:10:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.476 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.708 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.709 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:02 compute-0 nova_compute[187639]: 2026-02-23 11:10:02.709 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:10:03 compute-0 nova_compute[187639]: 2026-02-23 11:10:03.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:03 compute-0 podman[213526]: 2026-02-23 11:10:03.863878844 +0000 UTC m=+0.062678971 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 11:10:04 compute-0 nova_compute[187639]: 2026-02-23 11:10:04.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.743 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.743 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.744 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.744 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.946 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.948 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5842MB free_disk=73.2055549621582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.949 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:06 compute-0 nova_compute[187639]: 2026-02-23 11:10:06.950 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.022 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.023 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.058 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.078 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.108 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.109 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:07 compute-0 nova_compute[187639]: 2026-02-23 11:10:07.478 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:10:07 compute-0 ovn_controller[97601]: 2026-02-23T11:10:07Z|00142|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Feb 23 11:10:12 compute-0 nova_compute[187639]: 2026-02-23 11:10:12.480 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:12 compute-0 nova_compute[187639]: 2026-02-23 11:10:12.482 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:12.654 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:12.655 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:12.655 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:13 compute-0 nova_compute[187639]: 2026-02-23 11:10:13.110 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.650 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.650 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.674 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.771 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.771 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.780 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.781 187643 INFO nova.compute.claims [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.948 187643 DEBUG nova.compute.provider_tree [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.967 187643 DEBUG nova.scheduler.client.report [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.991 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:15 compute-0 nova_compute[187639]: 2026-02-23 11:10:15.991 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.059 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.060 187643 DEBUG nova.network.neutron [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.075 187643 INFO nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.089 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.174 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.175 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.176 187643 INFO nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Creating image(s)
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.176 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.176 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.177 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.192 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.256 187643 DEBUG nova.policy [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.271 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.273 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.273 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.295 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.375 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.377 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.405 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.406 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.407 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.450 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.452 187643 DEBUG nova.virt.disk.api [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.452 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.502 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.503 187643 DEBUG nova.virt.disk.api [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.503 187643 DEBUG nova.objects.instance [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid 0adcef24-ffcc-4db0-ae88-c24faaf87e3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.524 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.524 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Ensure instance console log exists: /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.525 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.525 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.526 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:16 compute-0 nova_compute[187639]: 2026-02-23 11:10:16.908 187643 DEBUG nova.network.neutron [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Successfully created port: 250dfcd2-0114-40ca-8ee7-4395debc5879 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:10:17 compute-0 nova_compute[187639]: 2026-02-23 11:10:17.484 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:10:17 compute-0 nova_compute[187639]: 2026-02-23 11:10:17.486 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:10:17 compute-0 nova_compute[187639]: 2026-02-23 11:10:17.486 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 11:10:17 compute-0 nova_compute[187639]: 2026-02-23 11:10:17.486 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:10:17 compute-0 nova_compute[187639]: 2026-02-23 11:10:17.523 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:17 compute-0 nova_compute[187639]: 2026-02-23 11:10:17.524 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:10:17 compute-0 podman[213564]: 2026-02-23 11:10:17.868534346 +0000 UTC m=+0.069064290 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:10:21 compute-0 sshd-session[213589]: Connection closed by authenticating user root 165.227.79.48 port 46794 [preauth]
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.206 187643 DEBUG nova.network.neutron [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Successfully updated port: 250dfcd2-0114-40ca-8ee7-4395debc5879 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.222 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.223 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.223 187643 DEBUG nova.network.neutron [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.321 187643 DEBUG nova.compute.manager [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-changed-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.321 187643 DEBUG nova.compute.manager [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Refreshing instance network info cache due to event network-changed-250dfcd2-0114-40ca-8ee7-4395debc5879. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.322 187643 DEBUG oslo_concurrency.lockutils [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:10:21 compute-0 nova_compute[187639]: 2026-02-23 11:10:21.785 187643 DEBUG nova.network.neutron [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:10:22 compute-0 nova_compute[187639]: 2026-02-23 11:10:22.525 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:10:22 compute-0 nova_compute[187639]: 2026-02-23 11:10:22.526 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:22 compute-0 nova_compute[187639]: 2026-02-23 11:10:22.526 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 11:10:22 compute-0 nova_compute[187639]: 2026-02-23 11:10:22.526 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:10:22 compute-0 nova_compute[187639]: 2026-02-23 11:10:22.527 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:10:22 compute-0 nova_compute[187639]: 2026-02-23 11:10:22.529 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.405 187643 DEBUG nova.network.neutron [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updating instance_info_cache with network_info: [{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.427 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.427 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Instance network_info: |[{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.428 187643 DEBUG oslo_concurrency.lockutils [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.428 187643 DEBUG nova.network.neutron [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Refreshing network info cache for port 250dfcd2-0114-40ca-8ee7-4395debc5879 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.431 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Start _get_guest_xml network_info=[{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.437 187643 WARNING nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.445 187643 DEBUG nova.virt.libvirt.host [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.446 187643 DEBUG nova.virt.libvirt.host [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.450 187643 DEBUG nova.virt.libvirt.host [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.450 187643 DEBUG nova.virt.libvirt.host [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.451 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.452 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.452 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.452 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.453 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.453 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.453 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.454 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.454 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.454 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.455 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.455 187643 DEBUG nova.virt.hardware [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.459 187643 DEBUG nova.virt.libvirt.vif [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1990068294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1990068294',id=17,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-oaigj2bs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:10:16Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=0adcef24-ffcc-4db0-ae88-c24faaf87e3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.459 187643 DEBUG nova.network.os_vif_util [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.460 187643 DEBUG nova.network.os_vif_util [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.461 187643 DEBUG nova.objects.instance [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 0adcef24-ffcc-4db0-ae88-c24faaf87e3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.483 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <uuid>0adcef24-ffcc-4db0-ae88-c24faaf87e3d</uuid>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <name>instance-00000011</name>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-1990068294</nova:name>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:10:23</nova:creationTime>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         <nova:port uuid="250dfcd2-0114-40ca-8ee7-4395debc5879">
Feb 23 11:10:23 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <system>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <entry name="serial">0adcef24-ffcc-4db0-ae88-c24faaf87e3d</entry>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <entry name="uuid">0adcef24-ffcc-4db0-ae88-c24faaf87e3d</entry>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </system>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <os>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </os>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <features>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </features>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.config"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:c4:45:77"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <target dev="tap250dfcd2-01"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/console.log" append="off"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <video>
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </video>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:10:23 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:10:23 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:10:23 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:10:23 compute-0 nova_compute[187639]: </domain>
Feb 23 11:10:23 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.484 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Preparing to wait for external event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.485 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.485 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.485 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.486 187643 DEBUG nova.virt.libvirt.vif [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1990068294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1990068294',id=17,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-oaigj2bs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:10:16Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=0adcef24-ffcc-4db0-ae88-c24faaf87e3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.486 187643 DEBUG nova.network.os_vif_util [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.487 187643 DEBUG nova.network.os_vif_util [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.487 187643 DEBUG os_vif [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.488 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.488 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.489 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.492 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.493 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap250dfcd2-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.494 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap250dfcd2-01, col_values=(('external_ids', {'iface-id': '250dfcd2-0114-40ca-8ee7-4395debc5879', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:45:77', 'vm-uuid': '0adcef24-ffcc-4db0-ae88-c24faaf87e3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.496 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:23 compute-0 NetworkManager[57207]: <info>  [1771845023.4974] manager: (tap250dfcd2-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.498 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.503 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.504 187643 INFO os_vif [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01')
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.558 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.558 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.558 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:c4:45:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:10:23 compute-0 nova_compute[187639]: 2026-02-23 11:10:23.559 187643 INFO nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Using config drive
Feb 23 11:10:23 compute-0 podman[213593]: 2026-02-23 11:10:23.880386928 +0000 UTC m=+0.072890551 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.447 187643 INFO nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Creating config drive at /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.config
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.452 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpuaxrro86 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.578 187643 DEBUG oslo_concurrency.processutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpuaxrro86" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:10:24 compute-0 kernel: tap250dfcd2-01: entered promiscuous mode
Feb 23 11:10:24 compute-0 NetworkManager[57207]: <info>  [1771845024.6457] manager: (tap250dfcd2-01): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Feb 23 11:10:24 compute-0 ovn_controller[97601]: 2026-02-23T11:10:24Z|00143|binding|INFO|Claiming lport 250dfcd2-0114-40ca-8ee7-4395debc5879 for this chassis.
Feb 23 11:10:24 compute-0 ovn_controller[97601]: 2026-02-23T11:10:24Z|00144|binding|INFO|250dfcd2-0114-40ca-8ee7-4395debc5879: Claiming fa:16:3e:c4:45:77 10.100.0.14
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.646 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.657 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:45:77 10.100.0.14'], port_security=['fa:16:3e:c4:45:77 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0adcef24-ffcc-4db0-ae88-c24faaf87e3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=250dfcd2-0114-40ca-8ee7-4395debc5879) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:10:24 compute-0 ovn_controller[97601]: 2026-02-23T11:10:24Z|00145|binding|INFO|Setting lport 250dfcd2-0114-40ca-8ee7-4395debc5879 ovn-installed in OVS
Feb 23 11:10:24 compute-0 ovn_controller[97601]: 2026-02-23T11:10:24Z|00146|binding|INFO|Setting lport 250dfcd2-0114-40ca-8ee7-4395debc5879 up in Southbound
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.659 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 250dfcd2-0114-40ca-8ee7-4395debc5879 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.659 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.662 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.664 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.674 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8145e5ff-d9a7-41d1-92ae-ab4175d23ac9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.675 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.678 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.679 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0e18525d-c826-4297-a731-6366fdb8ae2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.680 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0e159c43-9701-4f42-84c7-42da411663b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 systemd-machined[156970]: New machine qemu-13-instance-00000011.
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.691 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[be4be5ef-f4ee-40b3-9afe-51295c8ca084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.705 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3c59e135-9400-411d-9d97-beaf4b398556]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 systemd-udevd[213633]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:10:24 compute-0 NetworkManager[57207]: <info>  [1771845024.7256] device (tap250dfcd2-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:10:24 compute-0 NetworkManager[57207]: <info>  [1771845024.7269] device (tap250dfcd2-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.730 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[20c1cedf-1c0d-4dc3-8326-73d4b30493ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 systemd-udevd[213637]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.736 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[be9181f1-fa8a-4033-80d3-961bb989fcd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 NetworkManager[57207]: <info>  [1771845024.7386] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.775 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b304bb6a-eca3-4069-ac5f-a7125077fbf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.779 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[e950e415-d209-4357-9004-a60dcbff5128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 NetworkManager[57207]: <info>  [1771845024.7989] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.803 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[553e55ef-bd8f-4962-a9b6-625b4a5b536e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.818 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a36a7435-daa0-4242-a108-f93e28601966]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416896, 'reachable_time': 36248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213663, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.832 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e29e18a1-9fcb-4cca-a17e-5d8ea7be29fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416896, 'tstamp': 416896}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213664, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.845 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7bfabb-5a18-49e6-a881-426bd6182c7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416896, 'reachable_time': 36248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213665, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.876 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab9ee39-c429-484e-a78c-0e66cebbfa0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.922 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6179cf-9ef2-46fb-a288-10dcd8566482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.925 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.925 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.926 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.928 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:24 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.930 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:24 compute-0 NetworkManager[57207]: <info>  [1771845024.9323] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.934 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:24 compute-0 ovn_controller[97601]: 2026-02-23T11:10:24Z|00147|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.980 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.982 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.982 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb90340-573e-4ecc-b711-3c2e30c559de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.983 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:10:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:24.984 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:10:24 compute-0 nova_compute[187639]: 2026-02-23 11:10:24.992 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.204 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845025.2037444, 0adcef24-ffcc-4db0-ae88-c24faaf87e3d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.204 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] VM Started (Lifecycle Event)
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.225 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.231 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845025.204077, 0adcef24-ffcc-4db0-ae88-c24faaf87e3d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.231 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] VM Paused (Lifecycle Event)
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.256 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.260 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:10:25 compute-0 podman[213704]: 2026-02-23 11:10:25.28331832 +0000 UTC m=+0.046089262 container create eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216)
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.283 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:10:25 compute-0 systemd[1]: Started libpod-conmon-eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328.scope.
Feb 23 11:10:25 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:10:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23e2babe6aca9ede6978a4f1adb027234be45521f45d5157baa743e78f5a0904/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:10:25 compute-0 podman[213704]: 2026-02-23 11:10:25.355637285 +0000 UTC m=+0.118408207 container init eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 11:10:25 compute-0 podman[213704]: 2026-02-23 11:10:25.260909256 +0000 UTC m=+0.023680118 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:10:25 compute-0 podman[213704]: 2026-02-23 11:10:25.361355837 +0000 UTC m=+0.124126719 container start eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 23 11:10:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [NOTICE]   (213723) : New worker (213725) forked
Feb 23 11:10:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [NOTICE]   (213723) : Loading success.
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.598 187643 DEBUG nova.compute.manager [req-89538c99-739c-47b8-8ae4-7e9caf08e812 req-90850e7a-0c01-4d86-9901-4cdddc069e13 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.599 187643 DEBUG oslo_concurrency.lockutils [req-89538c99-739c-47b8-8ae4-7e9caf08e812 req-90850e7a-0c01-4d86-9901-4cdddc069e13 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.599 187643 DEBUG oslo_concurrency.lockutils [req-89538c99-739c-47b8-8ae4-7e9caf08e812 req-90850e7a-0c01-4d86-9901-4cdddc069e13 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.599 187643 DEBUG oslo_concurrency.lockutils [req-89538c99-739c-47b8-8ae4-7e9caf08e812 req-90850e7a-0c01-4d86-9901-4cdddc069e13 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.599 187643 DEBUG nova.compute.manager [req-89538c99-739c-47b8-8ae4-7e9caf08e812 req-90850e7a-0c01-4d86-9901-4cdddc069e13 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Processing event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.600 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.603 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845025.6031814, 0adcef24-ffcc-4db0-ae88-c24faaf87e3d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.603 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] VM Resumed (Lifecycle Event)
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.605 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.608 187643 INFO nova.virt.libvirt.driver [-] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Instance spawned successfully.
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.608 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.629 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.632 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.632 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.632 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.633 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.633 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.633 187643 DEBUG nova.virt.libvirt.driver [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.636 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.674 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.690 187643 INFO nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Took 9.52 seconds to spawn the instance on the hypervisor.
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.690 187643 DEBUG nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.756 187643 INFO nova.compute.manager [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Took 10.02 seconds to build instance.
Feb 23 11:10:25 compute-0 nova_compute[187639]: 2026-02-23 11:10:25.777 187643 DEBUG oslo_concurrency.lockutils [None req-d4466a0a-2677-4fb1-afe6-2bd57909b2fc 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:26 compute-0 nova_compute[187639]: 2026-02-23 11:10:26.291 187643 DEBUG nova.network.neutron [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updated VIF entry in instance network info cache for port 250dfcd2-0114-40ca-8ee7-4395debc5879. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:10:26 compute-0 nova_compute[187639]: 2026-02-23 11:10:26.292 187643 DEBUG nova.network.neutron [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updating instance_info_cache with network_info: [{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:10:26 compute-0 nova_compute[187639]: 2026-02-23 11:10:26.314 187643 DEBUG oslo_concurrency.lockutils [req-ed57cf50-8bec-413f-8322-3c74f4737bcd req-ce98b0fc-e371-4d95-8035-74c533886b64 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:10:27 compute-0 sshd-session[213734]: Invalid user admin from 143.198.30.3 port 38182
Feb 23 11:10:27 compute-0 sshd-session[213734]: Connection closed by invalid user admin 143.198.30.3 port 38182 [preauth]
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.530 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.705 187643 DEBUG nova.compute.manager [req-a358fe4f-a80d-42d1-9b27-04097dac000c req-3712ad12-52aa-491b-a092-3bdc86083b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.706 187643 DEBUG oslo_concurrency.lockutils [req-a358fe4f-a80d-42d1-9b27-04097dac000c req-3712ad12-52aa-491b-a092-3bdc86083b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.706 187643 DEBUG oslo_concurrency.lockutils [req-a358fe4f-a80d-42d1-9b27-04097dac000c req-3712ad12-52aa-491b-a092-3bdc86083b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.707 187643 DEBUG oslo_concurrency.lockutils [req-a358fe4f-a80d-42d1-9b27-04097dac000c req-3712ad12-52aa-491b-a092-3bdc86083b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.707 187643 DEBUG nova.compute.manager [req-a358fe4f-a80d-42d1-9b27-04097dac000c req-3712ad12-52aa-491b-a092-3bdc86083b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:10:27 compute-0 nova_compute[187639]: 2026-02-23 11:10:27.707 187643 WARNING nova.compute.manager [req-a358fe4f-a80d-42d1-9b27-04097dac000c req-3712ad12-52aa-491b-a092-3bdc86083b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received unexpected event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with vm_state active and task_state None.
Feb 23 11:10:28 compute-0 nova_compute[187639]: 2026-02-23 11:10:28.526 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:29 compute-0 podman[197002]: time="2026-02-23T11:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:10:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:10:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2630 "" "Go-http-client/1.1"
Feb 23 11:10:30 compute-0 podman[213736]: 2026-02-23 11:10:30.894287246 +0000 UTC m=+0.090794006 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 23 11:10:31 compute-0 openstack_network_exporter[199919]: ERROR   11:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:10:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:10:31 compute-0 openstack_network_exporter[199919]: ERROR   11:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:10:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:10:32 compute-0 nova_compute[187639]: 2026-02-23 11:10:32.531 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:33 compute-0 nova_compute[187639]: 2026-02-23 11:10:33.565 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:34 compute-0 podman[213764]: 2026-02-23 11:10:34.874573189 +0000 UTC m=+0.075972003 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, release=1770267347, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 23 11:10:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:36.776 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:10:36 compute-0 nova_compute[187639]: 2026-02-23 11:10:36.776 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:36 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:36.777 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:10:37 compute-0 nova_compute[187639]: 2026-02-23 11:10:37.531 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:38 compute-0 ovn_controller[97601]: 2026-02-23T11:10:38Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c4:45:77 10.100.0.14
Feb 23 11:10:38 compute-0 ovn_controller[97601]: 2026-02-23T11:10:38Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c4:45:77 10.100.0.14
Feb 23 11:10:38 compute-0 nova_compute[187639]: 2026-02-23 11:10:38.612 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:38 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:10:38.779 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:10:42 compute-0 nova_compute[187639]: 2026-02-23 11:10:42.533 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:43 compute-0 nova_compute[187639]: 2026-02-23 11:10:43.647 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:46 compute-0 sshd-session[213800]: Connection closed by 5.101.64.6 port 60023
Feb 23 11:10:47 compute-0 nova_compute[187639]: 2026-02-23 11:10:47.535 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:48 compute-0 nova_compute[187639]: 2026-02-23 11:10:48.683 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:48 compute-0 podman[213801]: 2026-02-23 11:10:48.846930676 +0000 UTC m=+0.050775656 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:10:52 compute-0 nova_compute[187639]: 2026-02-23 11:10:52.537 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:53 compute-0 nova_compute[187639]: 2026-02-23 11:10:53.738 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:54 compute-0 podman[213825]: 2026-02-23 11:10:54.855399839 +0000 UTC m=+0.057905804 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216)
Feb 23 11:10:57 compute-0 nova_compute[187639]: 2026-02-23 11:10:57.578 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:58 compute-0 nova_compute[187639]: 2026-02-23 11:10:58.770 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:10:59 compute-0 nova_compute[187639]: 2026-02-23 11:10:59.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:10:59 compute-0 podman[197002]: time="2026-02-23T11:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:10:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:10:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 23 11:11:00 compute-0 sshd-session[213845]: Invalid user admin from 143.198.30.3 port 58432
Feb 23 11:11:00 compute-0 sshd-session[213845]: Connection closed by invalid user admin 143.198.30.3 port 58432 [preauth]
Feb 23 11:11:01 compute-0 openstack_network_exporter[199919]: ERROR   11:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:11:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:11:01 compute-0 openstack_network_exporter[199919]: ERROR   11:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:11:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:11:01 compute-0 podman[213847]: 2026-02-23 11:11:01.89392398 +0000 UTC m=+0.086936293 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller)
Feb 23 11:11:02 compute-0 nova_compute[187639]: 2026-02-23 11:11:02.579 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:02 compute-0 nova_compute[187639]: 2026-02-23 11:11:02.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:02 compute-0 nova_compute[187639]: 2026-02-23 11:11:02.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:11:02 compute-0 sshd-session[213873]: Connection closed by 95.215.0.144 port 44244
Feb 23 11:11:02 compute-0 sshd-session[213874]: Unable to negotiate with 95.215.0.144 port 44246: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Feb 23 11:11:03 compute-0 nova_compute[187639]: 2026-02-23 11:11:03.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:03 compute-0 nova_compute[187639]: 2026-02-23 11:11:03.808 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:04 compute-0 nova_compute[187639]: 2026-02-23 11:11:04.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:04 compute-0 nova_compute[187639]: 2026-02-23 11:11:04.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:11:04 compute-0 nova_compute[187639]: 2026-02-23 11:11:04.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:11:05 compute-0 nova_compute[187639]: 2026-02-23 11:11:05.250 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:11:05 compute-0 nova_compute[187639]: 2026-02-23 11:11:05.250 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:11:05 compute-0 nova_compute[187639]: 2026-02-23 11:11:05.251 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:11:05 compute-0 nova_compute[187639]: 2026-02-23 11:11:05.251 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0adcef24-ffcc-4db0-ae88-c24faaf87e3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:11:05 compute-0 sshd-session[213876]: Connection closed by authenticating user root 165.227.79.48 port 34666 [preauth]
Feb 23 11:11:05 compute-0 podman[213878]: 2026-02-23 11:11:05.888117132 +0000 UTC m=+0.083194784 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Feb 23 11:11:06 compute-0 nova_compute[187639]: 2026-02-23 11:11:06.672 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updating instance_info_cache with network_info: [{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:11:06 compute-0 nova_compute[187639]: 2026-02-23 11:11:06.694 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:11:06 compute-0 nova_compute[187639]: 2026-02-23 11:11:06.694 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:11:06 compute-0 nova_compute[187639]: 2026-02-23 11:11:06.695 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:06 compute-0 nova_compute[187639]: 2026-02-23 11:11:06.695 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:06 compute-0 ovn_controller[97601]: 2026-02-23T11:11:06Z|00148|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 23 11:11:07 compute-0 nova_compute[187639]: 2026-02-23 11:11:07.580 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.731 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.732 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.732 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.733 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.816 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.841 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.880 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.881 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:11:08 compute-0 nova_compute[187639]: 2026-02-23 11:11:08.939 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.074 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.076 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5668MB free_disk=73.17622756958008GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.076 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.077 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.170 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance 0adcef24-ffcc-4db0-ae88-c24faaf87e3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.171 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.172 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.190 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.217 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.218 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.240 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.273 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.334 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.356 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.390 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:11:09 compute-0 nova_compute[187639]: 2026-02-23 11:11:09.390 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:12 compute-0 nova_compute[187639]: 2026-02-23 11:11:12.581 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:12.655 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:12.656 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:12.657 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:13 compute-0 nova_compute[187639]: 2026-02-23 11:11:13.391 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:11:13 compute-0 nova_compute[187639]: 2026-02-23 11:11:13.845 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:17 compute-0 nova_compute[187639]: 2026-02-23 11:11:17.584 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:18 compute-0 nova_compute[187639]: 2026-02-23 11:11:18.847 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:19 compute-0 podman[213906]: 2026-02-23 11:11:19.863937563 +0000 UTC m=+0.060331489 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 11:11:22 compute-0 nova_compute[187639]: 2026-02-23 11:11:22.585 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:22 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 23 11:11:23 compute-0 nova_compute[187639]: 2026-02-23 11:11:23.906 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:24 compute-0 nova_compute[187639]: 2026-02-23 11:11:24.921 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Check if temp file /var/lib/nova/instances/tmpzbo4f8x9 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 11:11:24 compute-0 nova_compute[187639]: 2026-02-23 11:11:24.922 187643 DEBUG nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzbo4f8x9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0adcef24-ffcc-4db0-ae88-c24faaf87e3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 11:11:25 compute-0 nova_compute[187639]: 2026-02-23 11:11:25.787 187643 DEBUG oslo_concurrency.processutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:11:25 compute-0 podman[213935]: 2026-02-23 11:11:25.85635497 +0000 UTC m=+0.059139005 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 23 11:11:25 compute-0 nova_compute[187639]: 2026-02-23 11:11:25.862 187643 DEBUG oslo_concurrency.processutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:11:25 compute-0 nova_compute[187639]: 2026-02-23 11:11:25.864 187643 DEBUG oslo_concurrency.processutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:11:25 compute-0 nova_compute[187639]: 2026-02-23 11:11:25.919 187643 DEBUG oslo_concurrency.processutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:11:27 compute-0 nova_compute[187639]: 2026-02-23 11:11:27.587 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:28 compute-0 sshd-session[213960]: Accepted publickey for nova from 192.168.122.101 port 60198 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 11:11:28 compute-0 nova_compute[187639]: 2026-02-23 11:11:28.940 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:28 compute-0 systemd-logind[808]: New session 37 of user nova.
Feb 23 11:11:28 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 11:11:28 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 11:11:28 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 11:11:29 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 11:11:29 compute-0 systemd[213964]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:11:29 compute-0 systemd[213964]: Queued start job for default target Main User Target.
Feb 23 11:11:29 compute-0 systemd[213964]: Created slice User Application Slice.
Feb 23 11:11:29 compute-0 systemd[213964]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:11:29 compute-0 systemd[213964]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 11:11:29 compute-0 systemd[213964]: Reached target Paths.
Feb 23 11:11:29 compute-0 systemd[213964]: Reached target Timers.
Feb 23 11:11:29 compute-0 systemd[213964]: Starting D-Bus User Message Bus Socket...
Feb 23 11:11:29 compute-0 systemd[213964]: Starting Create User's Volatile Files and Directories...
Feb 23 11:11:29 compute-0 systemd[213964]: Listening on D-Bus User Message Bus Socket.
Feb 23 11:11:29 compute-0 systemd[213964]: Reached target Sockets.
Feb 23 11:11:29 compute-0 systemd[213964]: Finished Create User's Volatile Files and Directories.
Feb 23 11:11:29 compute-0 systemd[213964]: Reached target Basic System.
Feb 23 11:11:29 compute-0 systemd[213964]: Reached target Main User Target.
Feb 23 11:11:29 compute-0 systemd[213964]: Startup finished in 136ms.
Feb 23 11:11:29 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 11:11:29 compute-0 systemd[1]: Started Session 37 of User nova.
Feb 23 11:11:29 compute-0 sshd-session[213960]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:11:29 compute-0 sshd-session[213979]: Received disconnect from 192.168.122.101 port 60198:11: disconnected by user
Feb 23 11:11:29 compute-0 sshd-session[213979]: Disconnected from user nova 192.168.122.101 port 60198
Feb 23 11:11:29 compute-0 sshd-session[213960]: pam_unix(sshd:session): session closed for user nova
Feb 23 11:11:29 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Feb 23 11:11:29 compute-0 systemd-logind[808]: Session 37 logged out. Waiting for processes to exit.
Feb 23 11:11:29 compute-0 systemd-logind[808]: Removed session 37.
Feb 23 11:11:29 compute-0 podman[197002]: time="2026-02-23T11:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:11:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:11:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 23 11:11:30 compute-0 nova_compute[187639]: 2026-02-23 11:11:30.643 187643 DEBUG nova.compute.manager [req-33811c09-a9e0-4437-878e-f3c4457ad0db req-5ed1da23-1609-4b80-b5b1-95cfab1f745a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:30 compute-0 nova_compute[187639]: 2026-02-23 11:11:30.644 187643 DEBUG oslo_concurrency.lockutils [req-33811c09-a9e0-4437-878e-f3c4457ad0db req-5ed1da23-1609-4b80-b5b1-95cfab1f745a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:30 compute-0 nova_compute[187639]: 2026-02-23 11:11:30.645 187643 DEBUG oslo_concurrency.lockutils [req-33811c09-a9e0-4437-878e-f3c4457ad0db req-5ed1da23-1609-4b80-b5b1-95cfab1f745a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:30 compute-0 nova_compute[187639]: 2026-02-23 11:11:30.645 187643 DEBUG oslo_concurrency.lockutils [req-33811c09-a9e0-4437-878e-f3c4457ad0db req-5ed1da23-1609-4b80-b5b1-95cfab1f745a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:30 compute-0 nova_compute[187639]: 2026-02-23 11:11:30.645 187643 DEBUG nova.compute.manager [req-33811c09-a9e0-4437-878e-f3c4457ad0db req-5ed1da23-1609-4b80-b5b1-95cfab1f745a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:30 compute-0 nova_compute[187639]: 2026-02-23 11:11:30.646 187643 DEBUG nova.compute.manager [req-33811c09-a9e0-4437-878e-f3c4457ad0db req-5ed1da23-1609-4b80-b5b1-95cfab1f745a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:11:31 compute-0 openstack_network_exporter[199919]: ERROR   11:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:11:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:11:31 compute-0 openstack_network_exporter[199919]: ERROR   11:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:11:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:11:31 compute-0 sshd-session[213981]: Invalid user admin from 143.198.30.3 port 60206
Feb 23 11:11:31 compute-0 sshd-session[213981]: Connection closed by invalid user admin 143.198.30.3 port 60206 [preauth]
Feb 23 11:11:31 compute-0 nova_compute[187639]: 2026-02-23 11:11:31.968 187643 INFO nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Took 6.05 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 11:11:31 compute-0 nova_compute[187639]: 2026-02-23 11:11:31.969 187643 DEBUG nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:11:31 compute-0 nova_compute[187639]: 2026-02-23 11:11:31.990 187643 DEBUG nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzbo4f8x9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0adcef24-ffcc-4db0-ae88-c24faaf87e3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(083d0b71-e6e6-4c93-82ac-160e2d39c050),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.013 187643 DEBUG nova.objects.instance [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid 0adcef24-ffcc-4db0-ae88-c24faaf87e3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.014 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.016 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.016 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.192 187643 DEBUG nova.virt.libvirt.vif [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1990068294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1990068294',id=17,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:10:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-oaigj2bs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:10:25Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=0adcef24-ffcc-4db0-ae88-c24faaf87e3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.193 187643 DEBUG nova.network.os_vif_util [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.194 187643 DEBUG nova.network.os_vif_util [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.195 187643 DEBUG nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 11:11:32 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:c4:45:77"/>
Feb 23 11:11:32 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 11:11:32 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:11:32 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 11:11:32 compute-0 nova_compute[187639]:   <target dev="tap250dfcd2-01"/>
Feb 23 11:11:32 compute-0 nova_compute[187639]: </interface>
Feb 23 11:11:32 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.196 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.519 187643 DEBUG nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.520 187643 INFO nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.605 187643 INFO nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.623 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.740 187643 DEBUG nova.compute.manager [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.740 187643 DEBUG oslo_concurrency.lockutils [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 DEBUG oslo_concurrency.lockutils [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 DEBUG oslo_concurrency.lockutils [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 DEBUG nova.compute.manager [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 WARNING nova.compute.manager [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received unexpected event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with vm_state active and task_state migrating.
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 DEBUG nova.compute.manager [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-changed-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 DEBUG nova.compute.manager [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Refreshing instance network info cache due to event network-changed-250dfcd2-0114-40ca-8ee7-4395debc5879. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.741 187643 DEBUG oslo_concurrency.lockutils [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.742 187643 DEBUG oslo_concurrency.lockutils [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:11:32 compute-0 nova_compute[187639]: 2026-02-23 11:11:32.742 187643 DEBUG nova.network.neutron [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Refreshing network info cache for port 250dfcd2-0114-40ca-8ee7-4395debc5879 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:11:32 compute-0 podman[213983]: 2026-02-23 11:11:32.896292278 +0000 UTC m=+0.091342461 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.107 187643 DEBUG nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.108 187643 DEBUG nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.612 187643 DEBUG nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.612 187643 DEBUG nova.virt.libvirt.migration [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.657 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845093.6562324, 0adcef24-ffcc-4db0-ae88-c24faaf87e3d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.657 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] VM Paused (Lifecycle Event)
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.675 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.679 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.699 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 11:11:33 compute-0 kernel: tap250dfcd2-01 (unregistering): left promiscuous mode
Feb 23 11:11:33 compute-0 NetworkManager[57207]: <info>  [1771845093.7609] device (tap250dfcd2-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:11:33 compute-0 ovn_controller[97601]: 2026-02-23T11:11:33Z|00149|binding|INFO|Releasing lport 250dfcd2-0114-40ca-8ee7-4395debc5879 from this chassis (sb_readonly=0)
Feb 23 11:11:33 compute-0 ovn_controller[97601]: 2026-02-23T11:11:33Z|00150|binding|INFO|Setting lport 250dfcd2-0114-40ca-8ee7-4395debc5879 down in Southbound
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.795 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 ovn_controller[97601]: 2026-02-23T11:11:33Z|00151|binding|INFO|Removing iface tap250dfcd2-01 ovn-installed in OVS
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.797 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.801 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.801 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:45:77 10.100.0.14'], port_security=['fa:16:3e:c4:45:77 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0adcef24-ffcc-4db0-ae88-c24faaf87e3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=250dfcd2-0114-40ca-8ee7-4395debc5879) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.803 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 250dfcd2-0114-40ca-8ee7-4395debc5879 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.805 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.806 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7f4ae5-db5e-454a-8f78-61542bf68d2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.807 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:11:33 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Feb 23 11:11:33 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 14.194s CPU time.
Feb 23 11:11:33 compute-0 systemd-machined[156970]: Machine qemu-13-instance-00000011 terminated.
Feb 23 11:11:33 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [NOTICE]   (213723) : haproxy version is 2.8.14-c23fe91
Feb 23 11:11:33 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [NOTICE]   (213723) : path to executable is /usr/sbin/haproxy
Feb 23 11:11:33 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [WARNING]  (213723) : Exiting Master process...
Feb 23 11:11:33 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [WARNING]  (213723) : Exiting Master process...
Feb 23 11:11:33 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [ALERT]    (213723) : Current worker (213725) exited with code 143 (Terminated)
Feb 23 11:11:33 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[213719]: [WARNING]  (213723) : All workers exited. Exiting... (0)
Feb 23 11:11:33 compute-0 systemd[1]: libpod-eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328.scope: Deactivated successfully.
Feb 23 11:11:33 compute-0 podman[214050]: 2026-02-23 11:11:33.910102734 +0000 UTC m=+0.037113886 container died eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 11:11:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328-userdata-shm.mount: Deactivated successfully.
Feb 23 11:11:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-23e2babe6aca9ede6978a4f1adb027234be45521f45d5157baa743e78f5a0904-merged.mount: Deactivated successfully.
Feb 23 11:11:33 compute-0 podman[214050]: 2026-02-23 11:11:33.941204641 +0000 UTC m=+0.068215783 container cleanup eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.941 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 systemd[1]: libpod-conmon-eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328.scope: Deactivated successfully.
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.954 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.957 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.976 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.976 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.976 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 11:11:33 compute-0 podman[214081]: 2026-02-23 11:11:33.992854458 +0000 UTC m=+0.037792674 container remove eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.995 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f324ab20-9213-4c5d-9401-34905c727adf]: (4, ('Mon Feb 23 11:11:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328)\neaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328\nMon Feb 23 11:11:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (eaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328)\neaa1cbf8527b869f249f292a5903ebba50b563c6799e45693a0fdbbfb5c74328\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.997 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8da09351-deda-4fc6-b9b0-cf60a176c35a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:33.997 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:11:33 compute-0 nova_compute[187639]: 2026-02-23 11:11:33.999 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:33 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.004 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:34.006 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3294fcb5-e406-4185-8cca-f8267430dd17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:34.018 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3123d069-0424-4a32-b736-fa458d0f4a99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:34.019 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[66369524-9da3-49e5-98b8-fc96d0df6f2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:34.029 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[07133921-61d7-4ec3-990e-eec08d062014]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416889, 'reachable_time': 42859, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214116, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:34.031 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:11:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:34.031 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[705d9963-cf3f-4301-a24d-5065e833531d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:11:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.114 187643 DEBUG nova.virt.libvirt.guest [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '0adcef24-ffcc-4db0-ae88-c24faaf87e3d' (instance-00000011) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.114 187643 INFO nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migration operation has completed
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.114 187643 INFO nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] _post_live_migration() is started..
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.469 187643 DEBUG nova.compute.manager [req-579e8ab8-9df5-4b10-95c1-2723d1d5acd2 req-d754de67-8403-43bf-8168-15378f085fbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.469 187643 DEBUG oslo_concurrency.lockutils [req-579e8ab8-9df5-4b10-95c1-2723d1d5acd2 req-d754de67-8403-43bf-8168-15378f085fbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.469 187643 DEBUG oslo_concurrency.lockutils [req-579e8ab8-9df5-4b10-95c1-2723d1d5acd2 req-d754de67-8403-43bf-8168-15378f085fbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.469 187643 DEBUG oslo_concurrency.lockutils [req-579e8ab8-9df5-4b10-95c1-2723d1d5acd2 req-d754de67-8403-43bf-8168-15378f085fbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.470 187643 DEBUG nova.compute.manager [req-579e8ab8-9df5-4b10-95c1-2723d1d5acd2 req-d754de67-8403-43bf-8168-15378f085fbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.470 187643 DEBUG nova.compute.manager [req-579e8ab8-9df5-4b10-95c1-2723d1d5acd2 req-d754de67-8403-43bf-8168-15378f085fbc 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.784 187643 DEBUG nova.network.neutron [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updated VIF entry in instance network info cache for port 250dfcd2-0114-40ca-8ee7-4395debc5879. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.784 187643 DEBUG nova.network.neutron [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Updating instance_info_cache with network_info: [{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.812 187643 DEBUG oslo_concurrency.lockutils [req-93bb1e99-2402-42e8-add4-9e572040440d req-ae62e5d6-3d69-4cba-b7fa-8ecf2e40c0be 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-0adcef24-ffcc-4db0-ae88-c24faaf87e3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.858 187643 DEBUG nova.compute.manager [req-80ff5401-1358-4049-9d85-ea1e343ea800 req-d39ade19-54c8-4f6f-ab0f-059069b8e916 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.859 187643 DEBUG oslo_concurrency.lockutils [req-80ff5401-1358-4049-9d85-ea1e343ea800 req-d39ade19-54c8-4f6f-ab0f-059069b8e916 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.859 187643 DEBUG oslo_concurrency.lockutils [req-80ff5401-1358-4049-9d85-ea1e343ea800 req-d39ade19-54c8-4f6f-ab0f-059069b8e916 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.859 187643 DEBUG oslo_concurrency.lockutils [req-80ff5401-1358-4049-9d85-ea1e343ea800 req-d39ade19-54c8-4f6f-ab0f-059069b8e916 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.859 187643 DEBUG nova.compute.manager [req-80ff5401-1358-4049-9d85-ea1e343ea800 req-d39ade19-54c8-4f6f-ab0f-059069b8e916 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:34 compute-0 nova_compute[187639]: 2026-02-23 11:11:34.859 187643 DEBUG nova.compute.manager [req-80ff5401-1358-4049-9d85-ea1e343ea800 req-d39ade19-54c8-4f6f-ab0f-059069b8e916 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-unplugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.294 187643 DEBUG nova.network.neutron [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port 250dfcd2-0114-40ca-8ee7-4395debc5879 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.294 187643 DEBUG nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.295 187643 DEBUG nova.virt.libvirt.vif [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1990068294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1990068294',id=17,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:10:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-oaigj2bs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:11:22Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=0adcef24-ffcc-4db0-ae88-c24faaf87e3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.295 187643 DEBUG nova.network.os_vif_util [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "250dfcd2-0114-40ca-8ee7-4395debc5879", "address": "fa:16:3e:c4:45:77", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap250dfcd2-01", "ovs_interfaceid": "250dfcd2-0114-40ca-8ee7-4395debc5879", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.296 187643 DEBUG nova.network.os_vif_util [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.296 187643 DEBUG os_vif [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.297 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.298 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap250dfcd2-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.349 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.352 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.355 187643 INFO os_vif [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:45:77,bridge_name='br-int',has_traffic_filtering=True,id=250dfcd2-0114-40ca-8ee7-4395debc5879,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap250dfcd2-01')
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.356 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.357 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.357 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.358 187643 DEBUG nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.359 187643 INFO nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Deleting instance files /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d_del
Feb 23 11:11:35 compute-0 nova_compute[187639]: 2026-02-23 11:11:35.360 187643 INFO nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Deletion of /var/lib/nova/instances/0adcef24-ffcc-4db0-ae88-c24faaf87e3d_del complete
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.556 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.557 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.557 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.557 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.558 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.558 187643 WARNING nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received unexpected event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with vm_state active and task_state migrating.
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.558 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.558 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.559 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.559 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.559 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.559 187643 WARNING nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received unexpected event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with vm_state active and task_state migrating.
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.559 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.560 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.560 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.560 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.560 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.561 187643 WARNING nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received unexpected event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with vm_state active and task_state migrating.
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.561 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.561 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.562 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.562 187643 DEBUG oslo_concurrency.lockutils [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.562 187643 DEBUG nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] No waiting events found dispatching network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:11:36 compute-0 nova_compute[187639]: 2026-02-23 11:11:36.562 187643 WARNING nova.compute.manager [req-83132e3e-5762-4eff-89fb-715bd8402f7e req-666bab22-80ba-481c-ae80-0d56b0547b24 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Received unexpected event network-vif-plugged-250dfcd2-0114-40ca-8ee7-4395debc5879 for instance with vm_state active and task_state migrating.
Feb 23 11:11:36 compute-0 podman[214117]: 2026-02-23 11:11:36.844489188 +0000 UTC m=+0.047927270 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 11:11:37 compute-0 nova_compute[187639]: 2026-02-23 11:11:37.668 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 11:11:39 compute-0 systemd[213964]: Activating special unit Exit the Session...
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped target Main User Target.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped target Basic System.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped target Paths.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped target Sockets.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped target Timers.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 11:11:39 compute-0 systemd[213964]: Closed D-Bus User Message Bus Socket.
Feb 23 11:11:39 compute-0 systemd[213964]: Stopped Create User's Volatile Files and Directories.
Feb 23 11:11:39 compute-0 systemd[213964]: Removed slice User Application Slice.
Feb 23 11:11:39 compute-0 systemd[213964]: Reached target Shutdown.
Feb 23 11:11:39 compute-0 systemd[213964]: Finished Exit the Session.
Feb 23 11:11:39 compute-0 systemd[213964]: Reached target Exit the Session.
Feb 23 11:11:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 11:11:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 11:11:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 11:11:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 11:11:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 11:11:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 11:11:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.382 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.665 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.666 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.666 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0adcef24-ffcc-4db0-ae88-c24faaf87e3d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.691 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.692 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.692 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.692 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.865 187643 WARNING nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.866 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5803MB free_disk=73.20552444458008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.866 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.867 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.911 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance 0adcef24-ffcc-4db0-ae88-c24faaf87e3d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.944 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.979 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration 083d0b71-e6e6-4c93-82ac-160e2d39c050 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.979 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:11:40 compute-0 nova_compute[187639]: 2026-02-23 11:11:40.980 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.030 187643 DEBUG nova.compute.provider_tree [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.044 187643 DEBUG nova.scheduler.client.report [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.070 187643 DEBUG nova.compute.resource_tracker [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.070 187643 DEBUG oslo_concurrency.lockutils [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.079 187643 INFO nova.compute.manager [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.176 187643 INFO nova.scheduler.client.report [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration 083d0b71-e6e6-4c93-82ac-160e2d39c050
Feb 23 11:11:41 compute-0 nova_compute[187639]: 2026-02-23 11:11:41.177 187643 DEBUG nova.virt.libvirt.driver [None req-d837195c-5c2c-4fd4-862a-a4eca0192d42 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 11:11:42 compute-0 nova_compute[187639]: 2026-02-23 11:11:42.718 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:45 compute-0 nova_compute[187639]: 2026-02-23 11:11:45.386 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:45 compute-0 nova_compute[187639]: 2026-02-23 11:11:45.614 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:45.614 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:11:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:45.616 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:11:47 compute-0 nova_compute[187639]: 2026-02-23 11:11:47.755 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:48 compute-0 nova_compute[187639]: 2026-02-23 11:11:48.976 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845093.9742744, 0adcef24-ffcc-4db0-ae88-c24faaf87e3d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:11:48 compute-0 nova_compute[187639]: 2026-02-23 11:11:48.976 187643 INFO nova.compute.manager [-] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] VM Stopped (Lifecycle Event)
Feb 23 11:11:49 compute-0 nova_compute[187639]: 2026-02-23 11:11:49.067 187643 DEBUG nova.compute.manager [None req-3d849850-e8ee-44e4-9bf7-c62c7f41d1aa - - - - - -] [instance: 0adcef24-ffcc-4db0-ae88-c24faaf87e3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:11:49 compute-0 sshd-session[214141]: Connection closed by authenticating user root 165.227.79.48 port 46444 [preauth]
Feb 23 11:11:50 compute-0 nova_compute[187639]: 2026-02-23 11:11:50.438 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:50 compute-0 podman[214143]: 2026-02-23 11:11:50.957610759 +0000 UTC m=+0.077363683 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:11:52 compute-0 nova_compute[187639]: 2026-02-23 11:11:52.797 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:55 compute-0 nova_compute[187639]: 2026-02-23 11:11:55.483 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:11:55.619 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:11:56 compute-0 podman[214168]: 2026-02-23 11:11:56.855576205 +0000 UTC m=+0.057014879 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:11:57 compute-0 nova_compute[187639]: 2026-02-23 11:11:57.838 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:11:59 compute-0 podman[197002]: time="2026-02-23T11:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:11:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:11:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 23 11:12:00 compute-0 nova_compute[187639]: 2026-02-23 11:12:00.521 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:00 compute-0 nova_compute[187639]: 2026-02-23 11:12:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:01 compute-0 openstack_network_exporter[199919]: ERROR   11:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:12:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:12:01 compute-0 openstack_network_exporter[199919]: ERROR   11:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:12:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:12:02 compute-0 nova_compute[187639]: 2026-02-23 11:12:02.841 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:03 compute-0 nova_compute[187639]: 2026-02-23 11:12:03.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:03 compute-0 podman[214188]: 2026-02-23 11:12:03.926521389 +0000 UTC m=+0.129400150 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:12:04 compute-0 sshd-session[214214]: Invalid user admin from 143.198.30.3 port 43736
Feb 23 11:12:04 compute-0 sshd-session[214214]: Connection closed by invalid user admin 143.198.30.3 port 43736 [preauth]
Feb 23 11:12:04 compute-0 nova_compute[187639]: 2026-02-23 11:12:04.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:04 compute-0 nova_compute[187639]: 2026-02-23 11:12:04.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:12:04 compute-0 nova_compute[187639]: 2026-02-23 11:12:04.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:12:04 compute-0 nova_compute[187639]: 2026-02-23 11:12:04.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:12:04 compute-0 nova_compute[187639]: 2026-02-23 11:12:04.711 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:04 compute-0 nova_compute[187639]: 2026-02-23 11:12:04.711 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:12:05 compute-0 nova_compute[187639]: 2026-02-23 11:12:05.527 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:05 compute-0 nova_compute[187639]: 2026-02-23 11:12:05.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:06 compute-0 nova_compute[187639]: 2026-02-23 11:12:06.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:07 compute-0 podman[214216]: 2026-02-23 11:12:07.842443031 +0000 UTC m=+0.047321334 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 11:12:07 compute-0 nova_compute[187639]: 2026-02-23 11:12:07.842 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.717 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.718 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.875 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.876 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5823MB free_disk=73.2055435180664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.876 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.876 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.940 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.941 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.969 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.981 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.983 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:12:08 compute-0 nova_compute[187639]: 2026-02-23 11:12:08.983 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:10 compute-0 nova_compute[187639]: 2026-02-23 11:12:10.566 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:12.656 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:12.656 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:12.656 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:12 compute-0 nova_compute[187639]: 2026-02-23 11:12:12.887 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:13 compute-0 nova_compute[187639]: 2026-02-23 11:12:13.934 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:13 compute-0 nova_compute[187639]: 2026-02-23 11:12:13.935 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:13 compute-0 nova_compute[187639]: 2026-02-23 11:12:13.956 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.070 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.071 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.079 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.080 187643 INFO nova.compute.claims [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.190 187643 DEBUG nova.compute.provider_tree [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.205 187643 DEBUG nova.scheduler.client.report [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.230 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.231 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.288 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.289 187643 DEBUG nova.network.neutron [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.307 187643 INFO nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.322 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.441 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.442 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.442 187643 INFO nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Creating image(s)
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.442 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.443 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.443 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.457 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.474 187643 DEBUG nova.policy [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.504 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.505 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.506 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.529 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.594 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.595 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.622 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.623 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.624 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.701 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.702 187643 DEBUG nova.virt.disk.api [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.702 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.759 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.760 187643 DEBUG nova.virt.disk.api [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.761 187643 DEBUG nova.objects.instance [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid a3bcc6f3-c040-4f37-bc36-53e02f8bda4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.780 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.780 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Ensure instance console log exists: /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.781 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.781 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.781 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:14 compute-0 nova_compute[187639]: 2026-02-23 11:12:14.984 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:15 compute-0 nova_compute[187639]: 2026-02-23 11:12:15.353 187643 DEBUG nova.network.neutron [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Successfully created port: 85f079eb-024d-4372-ab68-03414c9d3302 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:12:15 compute-0 nova_compute[187639]: 2026-02-23 11:12:15.589 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.031 187643 DEBUG nova.network.neutron [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Successfully updated port: 85f079eb-024d-4372-ab68-03414c9d3302 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.049 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.050 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.050 187643 DEBUG nova.network.neutron [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.115 187643 DEBUG nova.compute.manager [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-changed-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.115 187643 DEBUG nova.compute.manager [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Refreshing instance network info cache due to event network-changed-85f079eb-024d-4372-ab68-03414c9d3302. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.115 187643 DEBUG oslo_concurrency.lockutils [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:12:16 compute-0 nova_compute[187639]: 2026-02-23 11:12:16.276 187643 DEBUG nova.network.neutron [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.183 187643 DEBUG nova.network.neutron [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updating instance_info_cache with network_info: [{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.200 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.201 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Instance network_info: |[{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.201 187643 DEBUG oslo_concurrency.lockutils [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.201 187643 DEBUG nova.network.neutron [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Refreshing network info cache for port 85f079eb-024d-4372-ab68-03414c9d3302 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.203 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Start _get_guest_xml network_info=[{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.206 187643 WARNING nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.210 187643 DEBUG nova.virt.libvirt.host [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.210 187643 DEBUG nova.virt.libvirt.host [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.212 187643 DEBUG nova.virt.libvirt.host [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.213 187643 DEBUG nova.virt.libvirt.host [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.214 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.214 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.214 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.215 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.215 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.215 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.215 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.216 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.216 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.216 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.216 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.217 187643 DEBUG nova.virt.hardware [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.219 187643 DEBUG nova.virt.libvirt.vif [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1946849368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1946849368',id=19,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-yrk88hjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:12:14Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=a3bcc6f3-c040-4f37-bc36-53e02f8bda4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.219 187643 DEBUG nova.network.os_vif_util [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.220 187643 DEBUG nova.network.os_vif_util [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.221 187643 DEBUG nova.objects.instance [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid a3bcc6f3-c040-4f37-bc36-53e02f8bda4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.253 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <uuid>a3bcc6f3-c040-4f37-bc36-53e02f8bda4e</uuid>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <name>instance-00000013</name>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-1946849368</nova:name>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:12:17</nova:creationTime>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         <nova:port uuid="85f079eb-024d-4372-ab68-03414c9d3302">
Feb 23 11:12:17 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <system>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <entry name="serial">a3bcc6f3-c040-4f37-bc36-53e02f8bda4e</entry>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <entry name="uuid">a3bcc6f3-c040-4f37-bc36-53e02f8bda4e</entry>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </system>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <os>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </os>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <features>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </features>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.config"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:b0:86:4a"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <target dev="tap85f079eb-02"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/console.log" append="off"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <video>
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </video>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:12:17 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:12:17 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:12:17 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:12:17 compute-0 nova_compute[187639]: </domain>
Feb 23 11:12:17 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.253 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Preparing to wait for external event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.254 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.254 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.254 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.255 187643 DEBUG nova.virt.libvirt.vif [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1946849368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1946849368',id=19,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-yrk88hjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:12:14Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=a3bcc6f3-c040-4f37-bc36-53e02f8bda4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.255 187643 DEBUG nova.network.os_vif_util [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.255 187643 DEBUG nova.network.os_vif_util [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.256 187643 DEBUG os_vif [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.256 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.256 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.257 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.258 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.259 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85f079eb-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.259 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap85f079eb-02, col_values=(('external_ids', {'iface-id': '85f079eb-024d-4372-ab68-03414c9d3302', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:86:4a', 'vm-uuid': 'a3bcc6f3-c040-4f37-bc36-53e02f8bda4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.272 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 NetworkManager[57207]: <info>  [1771845137.2737] manager: (tap85f079eb-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.274 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.279 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.280 187643 INFO os_vif [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02')
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.371 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.372 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.372 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:b0:86:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.373 187643 INFO nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Using config drive
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.702 187643 INFO nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Creating config drive at /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.config
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.706 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpifea9qvn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.823 187643 DEBUG oslo_concurrency.processutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpifea9qvn" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:12:17 compute-0 kernel: tap85f079eb-02: entered promiscuous mode
Feb 23 11:12:17 compute-0 NetworkManager[57207]: <info>  [1771845137.8703] manager: (tap85f079eb-02): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.872 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 ovn_controller[97601]: 2026-02-23T11:12:17Z|00152|binding|INFO|Claiming lport 85f079eb-024d-4372-ab68-03414c9d3302 for this chassis.
Feb 23 11:12:17 compute-0 ovn_controller[97601]: 2026-02-23T11:12:17Z|00153|binding|INFO|85f079eb-024d-4372-ab68-03414c9d3302: Claiming fa:16:3e:b0:86:4a 10.100.0.9
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.878 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:86:4a 10.100.0.9'], port_security=['fa:16:3e:b0:86:4a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a3bcc6f3-c040-4f37-bc36-53e02f8bda4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=85f079eb-024d-4372-ab68-03414c9d3302) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.879 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 85f079eb-024d-4372-ab68-03414c9d3302 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:12:17 compute-0 ovn_controller[97601]: 2026-02-23T11:12:17Z|00154|binding|INFO|Setting lport 85f079eb-024d-4372-ab68-03414c9d3302 ovn-installed in OVS
Feb 23 11:12:17 compute-0 ovn_controller[97601]: 2026-02-23T11:12:17Z|00155|binding|INFO|Setting lport 85f079eb-024d-4372-ab68-03414c9d3302 up in Southbound
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.882 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.883 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.887 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 nova_compute[187639]: 2026-02-23 11:12:17.890 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.890 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a438bb4e-923c-4c01-a34c-2d1c66d0bc97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.892 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.895 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.895 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ad249b7b-dd5c-4a3e-9ca6-95238e2f51c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.896 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa431c9-442b-4f96-a608-8e2933ee780e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 systemd-machined[156970]: New machine qemu-14-instance-00000013.
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.905 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[723b8567-e78d-471e-a02c-c42e3228c86d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.915 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e14126c6-4d95-46eb-a07d-952975ef2630]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Feb 23 11:12:17 compute-0 systemd-udevd[214276]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:12:17 compute-0 NetworkManager[57207]: <info>  [1771845137.9345] device (tap85f079eb-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:12:17 compute-0 NetworkManager[57207]: <info>  [1771845137.9353] device (tap85f079eb-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.945 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2a4216-869a-4602-ae8c-5d01c1de6c26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 systemd-udevd[214279]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:12:17 compute-0 NetworkManager[57207]: <info>  [1771845137.9503] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.949 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8468e9f7-ddc1-457f-acf3-21227e6e3140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.972 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[3b9b2443-661f-4823-9561-7150d9396da7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.975 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b53f3f-9a7c-4039-b078-c182bfc22c66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:17 compute-0 NetworkManager[57207]: <info>  [1771845137.9951] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:17.999 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[a1df8783-5343-44c7-b916-024e4385b702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.012 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[04b6dd60-1de0-4e7d-8660-cec47200d1d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428216, 'reachable_time': 37373, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214305, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.025 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f650e431-5bf9-4c59-b47e-636b93d64d9e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428216, 'tstamp': 428216}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214306, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.040 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[c66892bf-0412-432c-857d-204e51308002]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428216, 'reachable_time': 37373, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214307, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.061 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[37997d0f-10aa-40d6-aee3-1fa7d8c5b6cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.098 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[c2640d46-f9c0-497e-957b-ff3987643012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.099 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.100 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.100 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:12:18 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.102 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:18 compute-0 NetworkManager[57207]: <info>  [1771845138.1024] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.106 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.107 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:18 compute-0 ovn_controller[97601]: 2026-02-23T11:12:18Z|00156|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.109 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.110 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.110 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[267f4f87-bd35-478c-8068-dd4c847ba25a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.111 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:12:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:12:18.112 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.222 187643 DEBUG nova.compute.manager [req-67ae2883-51d8-4242-89cd-20f640cbfdf5 req-7423c6c8-83c5-4e9a-8215-194f63e4352e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.223 187643 DEBUG oslo_concurrency.lockutils [req-67ae2883-51d8-4242-89cd-20f640cbfdf5 req-7423c6c8-83c5-4e9a-8215-194f63e4352e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.223 187643 DEBUG oslo_concurrency.lockutils [req-67ae2883-51d8-4242-89cd-20f640cbfdf5 req-7423c6c8-83c5-4e9a-8215-194f63e4352e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.223 187643 DEBUG oslo_concurrency.lockutils [req-67ae2883-51d8-4242-89cd-20f640cbfdf5 req-7423c6c8-83c5-4e9a-8215-194f63e4352e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.224 187643 DEBUG nova.compute.manager [req-67ae2883-51d8-4242-89cd-20f640cbfdf5 req-7423c6c8-83c5-4e9a-8215-194f63e4352e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Processing event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.263 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845138.262728, a3bcc6f3-c040-4f37-bc36-53e02f8bda4e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.263 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] VM Started (Lifecycle Event)
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.265 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.270 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.273 187643 INFO nova.virt.libvirt.driver [-] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Instance spawned successfully.
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.273 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.282 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.294 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.299 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.299 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.300 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.300 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.301 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.301 187643 DEBUG nova.virt.libvirt.driver [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.345 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.345 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845138.263511, a3bcc6f3-c040-4f37-bc36-53e02f8bda4e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.345 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] VM Paused (Lifecycle Event)
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.362 187643 INFO nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Took 3.92 seconds to spawn the instance on the hypervisor.
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.363 187643 DEBUG nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.364 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.369 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845138.2699537, a3bcc6f3-c040-4f37-bc36-53e02f8bda4e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.370 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] VM Resumed (Lifecycle Event)
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.407 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.409 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.428 187643 INFO nova.compute.manager [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Took 4.39 seconds to build instance.
Feb 23 11:12:18 compute-0 podman[214346]: 2026-02-23 11:12:18.450182965 +0000 UTC m=+0.068183492 container create d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.459 187643 DEBUG oslo_concurrency.lockutils [None req-ac5ad03e-3ed8-4bab-b1a1-31e42c0f78d9 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:18 compute-0 systemd[1]: Started libpod-conmon-d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0.scope.
Feb 23 11:12:18 compute-0 podman[214346]: 2026-02-23 11:12:18.402870982 +0000 UTC m=+0.020871519 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:12:18 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:12:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4964b78125e4f2f795ce896e7112eeced38cff87fa01fee56674de9b93719d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:12:18 compute-0 podman[214346]: 2026-02-23 11:12:18.531226014 +0000 UTC m=+0.149226551 container init d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 11:12:18 compute-0 podman[214346]: 2026-02-23 11:12:18.534647504 +0000 UTC m=+0.152648021 container start d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 11:12:18 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [NOTICE]   (214365) : New worker (214367) forked
Feb 23 11:12:18 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [NOTICE]   (214365) : Loading success.
Feb 23 11:12:18 compute-0 nova_compute[187639]: 2026-02-23 11:12:18.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:19 compute-0 nova_compute[187639]: 2026-02-23 11:12:19.352 187643 DEBUG nova.network.neutron [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updated VIF entry in instance network info cache for port 85f079eb-024d-4372-ab68-03414c9d3302. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:12:19 compute-0 nova_compute[187639]: 2026-02-23 11:12:19.353 187643 DEBUG nova.network.neutron [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updating instance_info_cache with network_info: [{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:12:19 compute-0 nova_compute[187639]: 2026-02-23 11:12:19.382 187643 DEBUG oslo_concurrency.lockutils [req-94c04f4f-080a-4997-a4fa-9cd6e03fc332 req-d1e18130-53e5-484b-be7a-fc7278083cb2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:12:20 compute-0 nova_compute[187639]: 2026-02-23 11:12:20.364 187643 DEBUG nova.compute.manager [req-070aed59-d582-4791-ada0-454072161c55 req-58b37b16-8a8c-4906-b269-c25a72b5e810 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:12:20 compute-0 nova_compute[187639]: 2026-02-23 11:12:20.364 187643 DEBUG oslo_concurrency.lockutils [req-070aed59-d582-4791-ada0-454072161c55 req-58b37b16-8a8c-4906-b269-c25a72b5e810 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:12:20 compute-0 nova_compute[187639]: 2026-02-23 11:12:20.365 187643 DEBUG oslo_concurrency.lockutils [req-070aed59-d582-4791-ada0-454072161c55 req-58b37b16-8a8c-4906-b269-c25a72b5e810 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:12:20 compute-0 nova_compute[187639]: 2026-02-23 11:12:20.365 187643 DEBUG oslo_concurrency.lockutils [req-070aed59-d582-4791-ada0-454072161c55 req-58b37b16-8a8c-4906-b269-c25a72b5e810 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:12:20 compute-0 nova_compute[187639]: 2026-02-23 11:12:20.365 187643 DEBUG nova.compute.manager [req-070aed59-d582-4791-ada0-454072161c55 req-58b37b16-8a8c-4906-b269-c25a72b5e810 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:12:20 compute-0 nova_compute[187639]: 2026-02-23 11:12:20.365 187643 WARNING nova.compute.manager [req-070aed59-d582-4791-ada0-454072161c55 req-58b37b16-8a8c-4906-b269-c25a72b5e810 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received unexpected event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with vm_state active and task_state None.
Feb 23 11:12:21 compute-0 podman[214376]: 2026-02-23 11:12:21.836794811 +0000 UTC m=+0.043227277 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:12:22 compute-0 nova_compute[187639]: 2026-02-23 11:12:22.274 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:22 compute-0 nova_compute[187639]: 2026-02-23 11:12:22.893 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:27 compute-0 nova_compute[187639]: 2026-02-23 11:12:27.277 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:27 compute-0 podman[214401]: 2026-02-23 11:12:27.853624341 +0000 UTC m=+0.052320916 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 23 11:12:27 compute-0 nova_compute[187639]: 2026-02-23 11:12:27.893 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:29 compute-0 podman[197002]: time="2026-02-23T11:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:12:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:12:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 23 11:12:31 compute-0 ovn_controller[97601]: 2026-02-23T11:12:31Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:86:4a 10.100.0.9
Feb 23 11:12:31 compute-0 ovn_controller[97601]: 2026-02-23T11:12:31Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:86:4a 10.100.0.9
Feb 23 11:12:31 compute-0 openstack_network_exporter[199919]: ERROR   11:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:12:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:12:31 compute-0 openstack_network_exporter[199919]: ERROR   11:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:12:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:12:32 compute-0 nova_compute[187639]: 2026-02-23 11:12:32.279 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:32 compute-0 nova_compute[187639]: 2026-02-23 11:12:32.894 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:34 compute-0 sshd-session[214435]: Connection closed by authenticating user root 165.227.79.48 port 33588 [preauth]
Feb 23 11:12:34 compute-0 podman[214437]: 2026-02-23 11:12:34.877035907 +0000 UTC m=+0.085337653 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller)
Feb 23 11:12:36 compute-0 sshd-session[214464]: Invalid user admin from 143.198.30.3 port 42222
Feb 23 11:12:36 compute-0 sshd-session[214464]: Connection closed by invalid user admin 143.198.30.3 port 42222 [preauth]
Feb 23 11:12:37 compute-0 nova_compute[187639]: 2026-02-23 11:12:37.282 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:37 compute-0 nova_compute[187639]: 2026-02-23 11:12:37.896 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:38 compute-0 podman[214466]: 2026-02-23 11:12:38.885152293 +0000 UTC m=+0.083018072 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, architecture=x86_64, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 11:12:42 compute-0 nova_compute[187639]: 2026-02-23 11:12:42.285 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:42 compute-0 nova_compute[187639]: 2026-02-23 11:12:42.948 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:47 compute-0 nova_compute[187639]: 2026-02-23 11:12:47.288 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:47 compute-0 nova_compute[187639]: 2026-02-23 11:12:47.996 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:52 compute-0 nova_compute[187639]: 2026-02-23 11:12:52.291 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:52 compute-0 podman[214488]: 2026-02-23 11:12:52.845747109 +0000 UTC m=+0.048118615 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:12:52 compute-0 nova_compute[187639]: 2026-02-23 11:12:52.998 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:56 compute-0 ovn_controller[97601]: 2026-02-23T11:12:56Z|00157|memory_trim|INFO|Detected inactivity (last active 30022 ms ago): trimming memory
Feb 23 11:12:57 compute-0 nova_compute[187639]: 2026-02-23 11:12:57.293 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:57 compute-0 nova_compute[187639]: 2026-02-23 11:12:57.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:12:57 compute-0 nova_compute[187639]: 2026-02-23 11:12:57.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 11:12:57 compute-0 nova_compute[187639]: 2026-02-23 11:12:57.708 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 11:12:58 compute-0 nova_compute[187639]: 2026-02-23 11:12:58.050 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:12:58 compute-0 podman[214515]: 2026-02-23 11:12:58.851543438 +0000 UTC m=+0.048108085 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:12:59 compute-0 podman[197002]: time="2026-02-23T11:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:12:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:12:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Feb 23 11:13:01 compute-0 openstack_network_exporter[199919]: ERROR   11:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:13:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:13:01 compute-0 openstack_network_exporter[199919]: ERROR   11:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:13:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:13:01 compute-0 nova_compute[187639]: 2026-02-23 11:13:01.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:02 compute-0 nova_compute[187639]: 2026-02-23 11:13:02.295 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:03 compute-0 nova_compute[187639]: 2026-02-23 11:13:03.051 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:05 compute-0 nova_compute[187639]: 2026-02-23 11:13:05.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:05 compute-0 nova_compute[187639]: 2026-02-23 11:13:05.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:13:05 compute-0 nova_compute[187639]: 2026-02-23 11:13:05.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:13:05 compute-0 podman[214535]: 2026-02-23 11:13:05.881120144 +0000 UTC m=+0.082740035 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:13:06 compute-0 nova_compute[187639]: 2026-02-23 11:13:06.299 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:13:06 compute-0 nova_compute[187639]: 2026-02-23 11:13:06.299 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:13:06 compute-0 nova_compute[187639]: 2026-02-23 11:13:06.299 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:13:06 compute-0 nova_compute[187639]: 2026-02-23 11:13:06.300 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid a3bcc6f3-c040-4f37-bc36-53e02f8bda4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:13:07 compute-0 nova_compute[187639]: 2026-02-23 11:13:07.297 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.063 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.369 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updating instance_info_cache with network_info: [{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.392 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.393 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.394 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.394 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.395 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.395 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:08 compute-0 nova_compute[187639]: 2026-02-23 11:13:08.395 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:13:08 compute-0 sshd-session[214561]: Invalid user admin from 143.198.30.3 port 34384
Feb 23 11:13:08 compute-0 sshd-session[214561]: Connection closed by invalid user admin 143.198.30.3 port 34384 [preauth]
Feb 23 11:13:09 compute-0 podman[214563]: 2026-02-23 11:13:09.903449251 +0000 UTC m=+0.098967621 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.openshift.expose-services=, version=9.7, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public)
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.727 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.728 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.729 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.729 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.809 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.853 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.854 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:13:10 compute-0 nova_compute[187639]: 2026-02-23 11:13:10.908 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.031 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.032 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5662MB free_disk=73.17621612548828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.033 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.033 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.096 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance a3bcc6f3-c040-4f37-bc36-53e02f8bda4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.096 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.097 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.181 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.199 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.223 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:13:11 compute-0 nova_compute[187639]: 2026-02-23 11:13:11.223 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:12 compute-0 nova_compute[187639]: 2026-02-23 11:13:12.299 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:12.657 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:12.658 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:12.658 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:13 compute-0 nova_compute[187639]: 2026-02-23 11:13:13.065 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:13 compute-0 nova_compute[187639]: 2026-02-23 11:13:13.302 187643 DEBUG nova.compute.manager [None req-f217463d-5a21-4c72-8af1-a2504b3069d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 23 11:13:13 compute-0 nova_compute[187639]: 2026-02-23 11:13:13.359 187643 DEBUG nova.compute.provider_tree [None req-f217463d-5a21-4c72-8af1-a2504b3069d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 24 to 28 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:13:16 compute-0 nova_compute[187639]: 2026-02-23 11:13:16.222 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:17 compute-0 nova_compute[187639]: 2026-02-23 11:13:17.254 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Check if temp file /var/lib/nova/instances/tmpr3hjygw3 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 11:13:17 compute-0 nova_compute[187639]: 2026-02-23 11:13:17.254 187643 DEBUG nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr3hjygw3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3bcc6f3-c040-4f37-bc36-53e02f8bda4e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 11:13:17 compute-0 nova_compute[187639]: 2026-02-23 11:13:17.303 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:18 compute-0 nova_compute[187639]: 2026-02-23 11:13:18.067 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:19 compute-0 nova_compute[187639]: 2026-02-23 11:13:19.549 187643 DEBUG oslo_concurrency.processutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:13:19 compute-0 nova_compute[187639]: 2026-02-23 11:13:19.635 187643 DEBUG oslo_concurrency.processutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:13:19 compute-0 nova_compute[187639]: 2026-02-23 11:13:19.636 187643 DEBUG oslo_concurrency.processutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:13:19 compute-0 nova_compute[187639]: 2026-02-23 11:13:19.681 187643 DEBUG oslo_concurrency.processutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:13:19 compute-0 sshd-session[214598]: Connection closed by authenticating user root 165.227.79.48 port 59870 [preauth]
Feb 23 11:13:22 compute-0 nova_compute[187639]: 2026-02-23 11:13:22.305 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:22 compute-0 sshd-session[214600]: Accepted publickey for nova from 192.168.122.101 port 42446 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 11:13:22 compute-0 systemd-logind[808]: New session 39 of user nova.
Feb 23 11:13:22 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 11:13:22 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 11:13:22 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 11:13:22 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 11:13:22 compute-0 systemd[214604]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:13:22 compute-0 systemd[214604]: Queued start job for default target Main User Target.
Feb 23 11:13:22 compute-0 systemd[214604]: Created slice User Application Slice.
Feb 23 11:13:22 compute-0 systemd[214604]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:13:22 compute-0 systemd[214604]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 11:13:22 compute-0 systemd[214604]: Reached target Paths.
Feb 23 11:13:22 compute-0 systemd[214604]: Reached target Timers.
Feb 23 11:13:22 compute-0 systemd[214604]: Starting D-Bus User Message Bus Socket...
Feb 23 11:13:22 compute-0 systemd[214604]: Starting Create User's Volatile Files and Directories...
Feb 23 11:13:22 compute-0 systemd[214604]: Listening on D-Bus User Message Bus Socket.
Feb 23 11:13:22 compute-0 systemd[214604]: Reached target Sockets.
Feb 23 11:13:22 compute-0 systemd[214604]: Finished Create User's Volatile Files and Directories.
Feb 23 11:13:22 compute-0 systemd[214604]: Reached target Basic System.
Feb 23 11:13:22 compute-0 systemd[214604]: Reached target Main User Target.
Feb 23 11:13:22 compute-0 systemd[214604]: Startup finished in 137ms.
Feb 23 11:13:22 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 11:13:22 compute-0 systemd[1]: Started Session 39 of User nova.
Feb 23 11:13:22 compute-0 sshd-session[214600]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:13:22 compute-0 sshd-session[214619]: Received disconnect from 192.168.122.101 port 42446:11: disconnected by user
Feb 23 11:13:22 compute-0 sshd-session[214619]: Disconnected from user nova 192.168.122.101 port 42446
Feb 23 11:13:22 compute-0 sshd-session[214600]: pam_unix(sshd:session): session closed for user nova
Feb 23 11:13:22 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Feb 23 11:13:22 compute-0 systemd-logind[808]: Session 39 logged out. Waiting for processes to exit.
Feb 23 11:13:22 compute-0 systemd-logind[808]: Removed session 39.
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.069 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:23.546 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.547 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:23 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:23.548 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.571 187643 DEBUG nova.compute.manager [req-0d83947f-0055-44cd-a524-41777c2670a8 req-51915ee8-448d-4713-b22c-a4f0bdf7f041 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.571 187643 DEBUG oslo_concurrency.lockutils [req-0d83947f-0055-44cd-a524-41777c2670a8 req-51915ee8-448d-4713-b22c-a4f0bdf7f041 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.572 187643 DEBUG oslo_concurrency.lockutils [req-0d83947f-0055-44cd-a524-41777c2670a8 req-51915ee8-448d-4713-b22c-a4f0bdf7f041 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.572 187643 DEBUG oslo_concurrency.lockutils [req-0d83947f-0055-44cd-a524-41777c2670a8 req-51915ee8-448d-4713-b22c-a4f0bdf7f041 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.572 187643 DEBUG nova.compute.manager [req-0d83947f-0055-44cd-a524-41777c2670a8 req-51915ee8-448d-4713-b22c-a4f0bdf7f041 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.572 187643 DEBUG nova.compute.manager [req-0d83947f-0055-44cd-a524-41777c2670a8 req-51915ee8-448d-4713-b22c-a4f0bdf7f041 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:13:23 compute-0 podman[214621]: 2026-02-23 11:13:23.843619788 +0000 UTC m=+0.047773336 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.890 187643 INFO nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Took 4.21 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.891 187643 DEBUG nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.908 187643 DEBUG nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr3hjygw3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3bcc6f3-c040-4f37-bc36-53e02f8bda4e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(7123da2d-5aca-467d-9cf7-6f6489010e70),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.928 187643 DEBUG nova.objects.instance [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid a3bcc6f3-c040-4f37-bc36-53e02f8bda4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.929 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.930 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.931 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.947 187643 DEBUG nova.virt.libvirt.vif [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1946849368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1946849368',id=19,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:12:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-yrk88hjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:12:18Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=a3bcc6f3-c040-4f37-bc36-53e02f8bda4e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.947 187643 DEBUG nova.network.os_vif_util [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.948 187643 DEBUG nova.network.os_vif_util [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.948 187643 DEBUG nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 11:13:23 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:b0:86:4a"/>
Feb 23 11:13:23 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 11:13:23 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:13:23 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 11:13:23 compute-0 nova_compute[187639]:   <target dev="tap85f079eb-02"/>
Feb 23 11:13:23 compute-0 nova_compute[187639]: </interface>
Feb 23 11:13:23 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 11:13:23 compute-0 nova_compute[187639]: 2026-02-23 11:13:23.949 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 11:13:24 compute-0 nova_compute[187639]: 2026-02-23 11:13:24.433 187643 DEBUG nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:13:24 compute-0 nova_compute[187639]: 2026-02-23 11:13:24.434 187643 INFO nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 11:13:24 compute-0 nova_compute[187639]: 2026-02-23 11:13:24.493 187643 INFO nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 11:13:24 compute-0 nova_compute[187639]: 2026-02-23 11:13:24.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:24 compute-0 nova_compute[187639]: 2026-02-23 11:13:24.996 187643 DEBUG nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:13:24 compute-0 nova_compute[187639]: 2026-02-23 11:13:24.997 187643 DEBUG nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.502 187643 DEBUG nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.503 187643 DEBUG nova.virt.libvirt.migration [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.629 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845205.6294322, a3bcc6f3-c040-4f37-bc36-53e02f8bda4e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.630 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] VM Paused (Lifecycle Event)
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.652 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.657 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.679 187643 DEBUG nova.compute.manager [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.680 187643 DEBUG oslo_concurrency.lockutils [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.680 187643 DEBUG oslo_concurrency.lockutils [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.681 187643 DEBUG oslo_concurrency.lockutils [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.681 187643 DEBUG nova.compute.manager [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.682 187643 WARNING nova.compute.manager [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received unexpected event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with vm_state active and task_state migrating.
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.682 187643 DEBUG nova.compute.manager [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-changed-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.683 187643 DEBUG nova.compute.manager [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Refreshing instance network info cache due to event network-changed-85f079eb-024d-4372-ab68-03414c9d3302. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.683 187643 DEBUG oslo_concurrency.lockutils [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.684 187643 DEBUG oslo_concurrency.lockutils [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.684 187643 DEBUG nova.network.neutron [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Refreshing network info cache for port 85f079eb-024d-4372-ab68-03414c9d3302 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.688 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.709 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 11:13:25 compute-0 kernel: tap85f079eb-02 (unregistering): left promiscuous mode
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.758 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:25 compute-0 NetworkManager[57207]: <info>  [1771845205.7598] device (tap85f079eb-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:13:25 compute-0 ovn_controller[97601]: 2026-02-23T11:13:25Z|00158|binding|INFO|Releasing lport 85f079eb-024d-4372-ab68-03414c9d3302 from this chassis (sb_readonly=0)
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.767 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:25 compute-0 ovn_controller[97601]: 2026-02-23T11:13:25Z|00159|binding|INFO|Setting lport 85f079eb-024d-4372-ab68-03414c9d3302 down in Southbound
Feb 23 11:13:25 compute-0 ovn_controller[97601]: 2026-02-23T11:13:25Z|00160|binding|INFO|Removing iface tap85f079eb-02 ovn-installed in OVS
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.769 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.777 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:86:4a 10.100.0.9'], port_security=['fa:16:3e:b0:86:4a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a3bcc6f3-c040-4f37-bc36-53e02f8bda4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=85f079eb-024d-4372-ab68-03414c9d3302) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.778 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 85f079eb-024d-4372-ab68-03414c9d3302 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.778 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.780 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.782 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1ffdbe-16be-4c14-9126-a97b38e134b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.782 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:13:25 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Feb 23 11:13:25 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 13.727s CPU time.
Feb 23 11:13:25 compute-0 systemd-machined[156970]: Machine qemu-14-instance-00000013 terminated.
Feb 23 11:13:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [NOTICE]   (214365) : haproxy version is 2.8.14-c23fe91
Feb 23 11:13:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [NOTICE]   (214365) : path to executable is /usr/sbin/haproxy
Feb 23 11:13:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [WARNING]  (214365) : Exiting Master process...
Feb 23 11:13:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [WARNING]  (214365) : Exiting Master process...
Feb 23 11:13:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [ALERT]    (214365) : Current worker (214367) exited with code 143 (Terminated)
Feb 23 11:13:25 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[214361]: [WARNING]  (214365) : All workers exited. Exiting... (0)
Feb 23 11:13:25 compute-0 systemd[1]: libpod-d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0.scope: Deactivated successfully.
Feb 23 11:13:25 compute-0 podman[214686]: 2026-02-23 11:13:25.905739547 +0000 UTC m=+0.048455104 container died d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 11:13:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0-userdata-shm.mount: Deactivated successfully.
Feb 23 11:13:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4964b78125e4f2f795ce896e7112eeced38cff87fa01fee56674de9b93719d9-merged.mount: Deactivated successfully.
Feb 23 11:13:25 compute-0 podman[214686]: 2026-02-23 11:13:25.935480288 +0000 UTC m=+0.078195845 container cleanup d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:13:25 compute-0 systemd[1]: libpod-conmon-d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0.scope: Deactivated successfully.
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.976 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.976 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 11:13:25 compute-0 nova_compute[187639]: 2026-02-23 11:13:25.976 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 11:13:25 compute-0 podman[214719]: 2026-02-23 11:13:25.992018693 +0000 UTC m=+0.039246152 container remove d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.994 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[262b852d-7110-4984-a834-8677ed689bf0]: (4, ('Mon Feb 23 11:13:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0)\nd60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0\nMon Feb 23 11:13:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (d60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0)\nd60cd5eb17d9ea5d4c4893e255035fb4629524f7c4fb665d30601927b677eaf0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.996 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[33192072-d84d-432e-b899-42787f654a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:25.997 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.000 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:26 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.004 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.006 187643 DEBUG nova.virt.libvirt.guest [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'a3bcc6f3-c040-4f37-bc36-53e02f8bda4e' (instance-00000013) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.006 187643 INFO nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migration operation has completed
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.006 187643 INFO nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] _post_live_migration() is started..
Feb 23 11:13:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:26.010 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b61004a8-30b7-44b4-ba60-74d7d5677906]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.012 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:26.026 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea2bd8c-727c-426c-ad84-172637661aca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:26.027 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d1065b48-7710-48af-916e-80988d845fd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:26.041 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0fcf54-1827-4a3c-be69-6d7a9e84d9eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428210, 'reachable_time': 38417, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214751, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:26.044 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:13:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:13:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:26.044 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[5210ea85-d8f7-4231-bb68-c30bc3a9d379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.612 187643 DEBUG nova.compute.manager [req-3d99e5fa-156c-41f5-a51e-5cb771cbc3c2 req-88e2e2df-5a1e-4456-ba99-3ca969a0832e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.613 187643 DEBUG oslo_concurrency.lockutils [req-3d99e5fa-156c-41f5-a51e-5cb771cbc3c2 req-88e2e2df-5a1e-4456-ba99-3ca969a0832e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.613 187643 DEBUG oslo_concurrency.lockutils [req-3d99e5fa-156c-41f5-a51e-5cb771cbc3c2 req-88e2e2df-5a1e-4456-ba99-3ca969a0832e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.614 187643 DEBUG oslo_concurrency.lockutils [req-3d99e5fa-156c-41f5-a51e-5cb771cbc3c2 req-88e2e2df-5a1e-4456-ba99-3ca969a0832e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.614 187643 DEBUG nova.compute.manager [req-3d99e5fa-156c-41f5-a51e-5cb771cbc3c2 req-88e2e2df-5a1e-4456-ba99-3ca969a0832e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.615 187643 DEBUG nova.compute.manager [req-3d99e5fa-156c-41f5-a51e-5cb771cbc3c2 req-88e2e2df-5a1e-4456-ba99-3ca969a0832e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.700 187643 DEBUG nova.network.neutron [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port 85f079eb-024d-4372-ab68-03414c9d3302 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.700 187643 DEBUG nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.702 187643 DEBUG nova.virt.libvirt.vif [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:12:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1946849368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1946849368',id=19,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:12:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-yrk88hjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:13:15Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=a3bcc6f3-c040-4f37-bc36-53e02f8bda4e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.702 187643 DEBUG nova.network.os_vif_util [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.704 187643 DEBUG nova.network.os_vif_util [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.704 187643 DEBUG os_vif [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.707 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.707 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f079eb-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.750 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.752 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.756 187643 INFO os_vif [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:86:4a,bridge_name='br-int',has_traffic_filtering=True,id=85f079eb-024d-4372-ab68-03414c9d3302,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f079eb-02')
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.756 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.757 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.757 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.758 187643 DEBUG nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.758 187643 INFO nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Deleting instance files /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e_del
Feb 23 11:13:26 compute-0 nova_compute[187639]: 2026-02-23 11:13:26.759 187643 INFO nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Deletion of /var/lib/nova/instances/a3bcc6f3-c040-4f37-bc36-53e02f8bda4e_del complete
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.189 187643 DEBUG nova.network.neutron [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updated VIF entry in instance network info cache for port 85f079eb-024d-4372-ab68-03414c9d3302. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.190 187643 DEBUG nova.network.neutron [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Updating instance_info_cache with network_info: [{"id": "85f079eb-024d-4372-ab68-03414c9d3302", "address": "fa:16:3e:b0:86:4a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f079eb-02", "ovs_interfaceid": "85f079eb-024d-4372-ab68-03414c9d3302", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.215 187643 DEBUG oslo_concurrency.lockutils [req-09aeba33-e5c5-4f6e-9bac-b8f3db0eb960 req-8c2cc1a3-0393-4828-a7f0-66fbc8ef22b9 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-a3bcc6f3-c040-4f37-bc36-53e02f8bda4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.804 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.804 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.804 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.805 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.805 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.805 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-unplugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.805 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.806 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.806 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.806 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.806 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.807 187643 WARNING nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received unexpected event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with vm_state active and task_state migrating.
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.807 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.807 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.808 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.808 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.808 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.808 187643 WARNING nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received unexpected event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with vm_state active and task_state migrating.
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.809 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.809 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.809 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.809 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.810 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.810 187643 WARNING nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received unexpected event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with vm_state active and task_state migrating.
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.810 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.810 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.811 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.811 187643 DEBUG oslo_concurrency.lockutils [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.811 187643 DEBUG nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] No waiting events found dispatching network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:13:27 compute-0 nova_compute[187639]: 2026-02-23 11:13:27.811 187643 WARNING nova.compute.manager [req-f84bd4f6-b5fa-4664-a2bd-38e3aabdbef5 req-090f3e74-f6f3-43c9-b41a-d6d3869e40b8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Received unexpected event network-vif-plugged-85f079eb-024d-4372-ab68-03414c9d3302 for instance with vm_state active and task_state migrating.
Feb 23 11:13:28 compute-0 nova_compute[187639]: 2026-02-23 11:13:28.109 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:29 compute-0 podman[197002]: time="2026-02-23T11:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:13:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:13:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 23 11:13:29 compute-0 podman[214752]: 2026-02-23 11:13:29.842797033 +0000 UTC m=+0.049535282 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 11:13:31 compute-0 openstack_network_exporter[199919]: ERROR   11:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:13:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:13:31 compute-0 openstack_network_exporter[199919]: ERROR   11:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:13:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:13:31 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:13:31.552 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:13:31 compute-0 nova_compute[187639]: 2026-02-23 11:13:31.801 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:32 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 11:13:32 compute-0 systemd[214604]: Activating special unit Exit the Session...
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped target Main User Target.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped target Basic System.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped target Paths.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped target Sockets.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped target Timers.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 11:13:32 compute-0 systemd[214604]: Closed D-Bus User Message Bus Socket.
Feb 23 11:13:32 compute-0 systemd[214604]: Stopped Create User's Volatile Files and Directories.
Feb 23 11:13:32 compute-0 systemd[214604]: Removed slice User Application Slice.
Feb 23 11:13:32 compute-0 systemd[214604]: Reached target Shutdown.
Feb 23 11:13:32 compute-0 systemd[214604]: Finished Exit the Session.
Feb 23 11:13:32 compute-0 systemd[214604]: Reached target Exit the Session.
Feb 23 11:13:32 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 11:13:32 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 11:13:32 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 11:13:32 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 11:13:32 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 11:13:32 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 11:13:32 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 11:13:33 compute-0 nova_compute[187639]: 2026-02-23 11:13:33.111 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.432 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.433 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.433 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "a3bcc6f3-c040-4f37-bc36-53e02f8bda4e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.458 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.459 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.459 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.459 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.573 187643 WARNING nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.574 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5765MB free_disk=73.2048454284668GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.575 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.575 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.622 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance a3bcc6f3-c040-4f37-bc36-53e02f8bda4e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.646 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.679 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration 7123da2d-5aca-467d-9cf7-6f6489010e70 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.680 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.680 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.816 187643 DEBUG nova.compute.provider_tree [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.851 187643 DEBUG nova.scheduler.client.report [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.877 187643 DEBUG nova.compute.resource_tracker [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.878 187643 DEBUG oslo_concurrency.lockutils [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.882 187643 INFO nova.compute.manager [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.989 187643 INFO nova.scheduler.client.report [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration 7123da2d-5aca-467d-9cf7-6f6489010e70
Feb 23 11:13:35 compute-0 nova_compute[187639]: 2026-02-23 11:13:35.990 187643 DEBUG nova.virt.libvirt.driver [None req-a8475439-6eb0-47f0-aa2b-015018b7659f a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 11:13:36 compute-0 nova_compute[187639]: 2026-02-23 11:13:36.821 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:36 compute-0 podman[214773]: 2026-02-23 11:13:36.892341345 +0000 UTC m=+0.087070578 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 11:13:38 compute-0 nova_compute[187639]: 2026-02-23 11:13:38.113 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:40 compute-0 podman[214799]: 2026-02-23 11:13:40.881667126 +0000 UTC m=+0.082164769 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1770267347, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 11:13:40 compute-0 nova_compute[187639]: 2026-02-23 11:13:40.975 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845205.973963, a3bcc6f3-c040-4f37-bc36-53e02f8bda4e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:13:40 compute-0 nova_compute[187639]: 2026-02-23 11:13:40.975 187643 INFO nova.compute.manager [-] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] VM Stopped (Lifecycle Event)
Feb 23 11:13:41 compute-0 nova_compute[187639]: 2026-02-23 11:13:41.011 187643 DEBUG nova.compute.manager [None req-1d286a4a-4d59-4613-a62e-17d7acb4fd1b - - - - - -] [instance: a3bcc6f3-c040-4f37-bc36-53e02f8bda4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:13:41 compute-0 sshd-session[214820]: Invalid user admin from 143.198.30.3 port 42088
Feb 23 11:13:41 compute-0 sshd-session[214820]: Connection closed by invalid user admin 143.198.30.3 port 42088 [preauth]
Feb 23 11:13:41 compute-0 nova_compute[187639]: 2026-02-23 11:13:41.862 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:43 compute-0 nova_compute[187639]: 2026-02-23 11:13:43.116 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:46 compute-0 nova_compute[187639]: 2026-02-23 11:13:46.902 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:48 compute-0 nova_compute[187639]: 2026-02-23 11:13:48.148 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:51 compute-0 nova_compute[187639]: 2026-02-23 11:13:51.959 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:53 compute-0 nova_compute[187639]: 2026-02-23 11:13:53.150 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:54 compute-0 podman[214822]: 2026-02-23 11:13:54.841480409 +0000 UTC m=+0.047707844 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:13:56 compute-0 nova_compute[187639]: 2026-02-23 11:13:56.961 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:58 compute-0 nova_compute[187639]: 2026-02-23 11:13:58.182 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:13:59 compute-0 podman[197002]: time="2026-02-23T11:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:13:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:13:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 23 11:14:00 compute-0 podman[214848]: 2026-02-23 11:14:00.838379266 +0000 UTC m=+0.041101461 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:14:01 compute-0 openstack_network_exporter[199919]: ERROR   11:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:14:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:14:01 compute-0 openstack_network_exporter[199919]: ERROR   11:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:14:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:14:01 compute-0 nova_compute[187639]: 2026-02-23 11:14:01.994 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:03 compute-0 nova_compute[187639]: 2026-02-23 11:14:03.216 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:03 compute-0 sshd-session[214868]: Invalid user admin from 165.227.79.48 port 53566
Feb 23 11:14:03 compute-0 sshd-session[214868]: Connection closed by invalid user admin 165.227.79.48 port 53566 [preauth]
Feb 23 11:14:03 compute-0 nova_compute[187639]: 2026-02-23 11:14:03.721 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.706 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.706 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.706 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.706 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.706 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:14:06 compute-0 nova_compute[187639]: 2026-02-23 11:14:06.996 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:07 compute-0 podman[214870]: 2026-02-23 11:14:07.877325397 +0000 UTC m=+0.085255180 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:14:08 compute-0 nova_compute[187639]: 2026-02-23 11:14:08.218 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:09 compute-0 nova_compute[187639]: 2026-02-23 11:14:09.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.726 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.727 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.727 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.728 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:14:11 compute-0 podman[214898]: 2026-02-23 11:14:11.877719509 +0000 UTC m=+0.070854483 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.890 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.891 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5789MB free_disk=73.20491409301758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.891 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.891 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.958 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.959 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:14:11 compute-0 ovn_controller[97601]: 2026-02-23T11:14:11Z|00161|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 23 11:14:11 compute-0 nova_compute[187639]: 2026-02-23 11:14:11.989 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:14:12 compute-0 nova_compute[187639]: 2026-02-23 11:14:12.004 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:14:12 compute-0 nova_compute[187639]: 2026-02-23 11:14:12.006 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:14:12 compute-0 nova_compute[187639]: 2026-02-23 11:14:12.006 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:12 compute-0 nova_compute[187639]: 2026-02-23 11:14:12.027 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:12.659 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:12.659 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:12.660 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:13 compute-0 nova_compute[187639]: 2026-02-23 11:14:13.254 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:13 compute-0 sshd-session[214918]: Invalid user admin from 143.198.30.3 port 38298
Feb 23 11:14:13 compute-0 sshd-session[214918]: Connection closed by invalid user admin 143.198.30.3 port 38298 [preauth]
Feb 23 11:14:15 compute-0 nova_compute[187639]: 2026-02-23 11:14:15.166 187643 DEBUG nova.compute.manager [None req-a6f9cc74-60b2-4155-953b-5971e78f852a d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 23 11:14:15 compute-0 nova_compute[187639]: 2026-02-23 11:14:15.316 187643 DEBUG nova.compute.provider_tree [None req-a6f9cc74-60b2-4155-953b-5971e78f852a d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 28 to 31 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:14:16 compute-0 nova_compute[187639]: 2026-02-23 11:14:16.006 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:17 compute-0 nova_compute[187639]: 2026-02-23 11:14:17.028 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:18 compute-0 nova_compute[187639]: 2026-02-23 11:14:18.280 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:22 compute-0 nova_compute[187639]: 2026-02-23 11:14:22.075 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:23 compute-0 nova_compute[187639]: 2026-02-23 11:14:23.321 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:23 compute-0 nova_compute[187639]: 2026-02-23 11:14:23.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:14:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:25.395 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:14:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:25.395 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:14:25 compute-0 nova_compute[187639]: 2026-02-23 11:14:25.396 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:25 compute-0 podman[214920]: 2026-02-23 11:14:25.8496178 +0000 UTC m=+0.049366738 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:14:27 compute-0 nova_compute[187639]: 2026-02-23 11:14:27.078 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:28 compute-0 nova_compute[187639]: 2026-02-23 11:14:28.330 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:29 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:29.397 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:29 compute-0 podman[197002]: time="2026-02-23T11:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:14:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:14:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 23 11:14:31 compute-0 openstack_network_exporter[199919]: ERROR   11:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:14:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:14:31 compute-0 openstack_network_exporter[199919]: ERROR   11:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:14:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:14:31 compute-0 podman[214944]: 2026-02-23 11:14:31.481353541 +0000 UTC m=+0.041255004 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent)
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.107 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.843 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.843 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.865 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.957 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.957 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.963 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:14:32 compute-0 nova_compute[187639]: 2026-02-23 11:14:32.964 187643 INFO nova.compute.claims [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.090 187643 DEBUG nova.compute.provider_tree [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.105 187643 DEBUG nova.scheduler.client.report [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.129 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.130 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.186 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.187 187643 DEBUG nova.network.neutron [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.208 187643 INFO nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.227 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.331 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.386 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.388 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.389 187643 INFO nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Creating image(s)
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.390 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.390 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.391 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.406 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.446 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.447 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.448 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.463 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.504 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.505 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.546 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.547 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.548 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.590 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.591 187643 DEBUG nova.virt.disk.api [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.592 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.635 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.636 187643 DEBUG nova.virt.disk.api [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.637 187643 DEBUG nova.objects.instance [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid 7304eee8-2c91-4686-b99a-d3732086baaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.656 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.657 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Ensure instance console log exists: /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.658 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.658 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:33 compute-0 nova_compute[187639]: 2026-02-23 11:14:33.659 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:34 compute-0 nova_compute[187639]: 2026-02-23 11:14:34.443 187643 DEBUG nova.policy [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:14:35 compute-0 nova_compute[187639]: 2026-02-23 11:14:35.758 187643 DEBUG nova.network.neutron [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Successfully created port: bad61a3a-6f81-430b-8c05-446e7f369295 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.349 187643 DEBUG nova.network.neutron [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Successfully updated port: bad61a3a-6f81-430b-8c05-446e7f369295 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.370 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-7304eee8-2c91-4686-b99a-d3732086baaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.371 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-7304eee8-2c91-4686-b99a-d3732086baaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.371 187643 DEBUG nova.network.neutron [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.464 187643 DEBUG nova.compute.manager [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-changed-bad61a3a-6f81-430b-8c05-446e7f369295 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.465 187643 DEBUG nova.compute.manager [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Refreshing instance network info cache due to event network-changed-bad61a3a-6f81-430b-8c05-446e7f369295. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.465 187643 DEBUG oslo_concurrency.lockutils [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-7304eee8-2c91-4686-b99a-d3732086baaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:14:36 compute-0 nova_compute[187639]: 2026-02-23 11:14:36.517 187643 DEBUG nova.network.neutron [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:14:37 compute-0 nova_compute[187639]: 2026-02-23 11:14:37.171 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.371 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.562 187643 DEBUG nova.network.neutron [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Updating instance_info_cache with network_info: [{"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.592 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-7304eee8-2c91-4686-b99a-d3732086baaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.593 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Instance network_info: |[{"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.593 187643 DEBUG oslo_concurrency.lockutils [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-7304eee8-2c91-4686-b99a-d3732086baaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.593 187643 DEBUG nova.network.neutron [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Refreshing network info cache for port bad61a3a-6f81-430b-8c05-446e7f369295 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.597 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Start _get_guest_xml network_info=[{"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.601 187643 WARNING nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.606 187643 DEBUG nova.virt.libvirt.host [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.607 187643 DEBUG nova.virt.libvirt.host [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.610 187643 DEBUG nova.virt.libvirt.host [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.610 187643 DEBUG nova.virt.libvirt.host [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.612 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.612 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.612 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.613 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.613 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.613 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.614 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.614 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.614 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.614 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.615 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.615 187643 DEBUG nova.virt.hardware [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.619 187643 DEBUG nova.virt.libvirt.vif [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:14:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1703562665',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1703562665',id=22,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-ah98ec5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:14:33Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=7304eee8-2c91-4686-b99a-d3732086baaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.620 187643 DEBUG nova.network.os_vif_util [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.620 187643 DEBUG nova.network.os_vif_util [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.621 187643 DEBUG nova.objects.instance [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 7304eee8-2c91-4686-b99a-d3732086baaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.643 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <uuid>7304eee8-2c91-4686-b99a-d3732086baaf</uuid>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <name>instance-00000016</name>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-1703562665</nova:name>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:14:38</nova:creationTime>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         <nova:port uuid="bad61a3a-6f81-430b-8c05-446e7f369295">
Feb 23 11:14:38 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <system>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <entry name="serial">7304eee8-2c91-4686-b99a-d3732086baaf</entry>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <entry name="uuid">7304eee8-2c91-4686-b99a-d3732086baaf</entry>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </system>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <os>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </os>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <features>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </features>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.config"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:9b:f4:6d"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <target dev="tapbad61a3a-6f"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/console.log" append="off"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <video>
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </video>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:14:38 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:14:38 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:14:38 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:14:38 compute-0 nova_compute[187639]: </domain>
Feb 23 11:14:38 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.645 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Preparing to wait for external event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.645 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.645 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.645 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.646 187643 DEBUG nova.virt.libvirt.vif [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:14:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1703562665',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1703562665',id=22,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-ah98ec5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:14:33Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=7304eee8-2c91-4686-b99a-d3732086baaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.646 187643 DEBUG nova.network.os_vif_util [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.647 187643 DEBUG nova.network.os_vif_util [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.647 187643 DEBUG os_vif [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.648 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.648 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.649 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.651 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.651 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbad61a3a-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.652 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbad61a3a-6f, col_values=(('external_ids', {'iface-id': 'bad61a3a-6f81-430b-8c05-446e7f369295', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:f4:6d', 'vm-uuid': '7304eee8-2c91-4686-b99a-d3732086baaf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.654 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:38 compute-0 NetworkManager[57207]: <info>  [1771845278.6556] manager: (tapbad61a3a-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.657 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.660 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.661 187643 INFO os_vif [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f')
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.707 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.707 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.708 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:9b:f4:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:14:38 compute-0 nova_compute[187639]: 2026-02-23 11:14:38.709 187643 INFO nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Using config drive
Feb 23 11:14:38 compute-0 podman[214980]: 2026-02-23 11:14:38.874859429 +0000 UTC m=+0.073253596 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.561 187643 INFO nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Creating config drive at /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.config
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.569 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp0_hy2ih_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.687 187643 DEBUG oslo_concurrency.processutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp0_hy2ih_" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:39 compute-0 kernel: tapbad61a3a-6f: entered promiscuous mode
Feb 23 11:14:39 compute-0 NetworkManager[57207]: <info>  [1771845279.7503] manager: (tapbad61a3a-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Feb 23 11:14:39 compute-0 ovn_controller[97601]: 2026-02-23T11:14:39Z|00162|binding|INFO|Claiming lport bad61a3a-6f81-430b-8c05-446e7f369295 for this chassis.
Feb 23 11:14:39 compute-0 ovn_controller[97601]: 2026-02-23T11:14:39Z|00163|binding|INFO|bad61a3a-6f81-430b-8c05-446e7f369295: Claiming fa:16:3e:9b:f4:6d 10.100.0.6
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.751 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.761 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:39 compute-0 ovn_controller[97601]: 2026-02-23T11:14:39Z|00164|binding|INFO|Setting lport bad61a3a-6f81-430b-8c05-446e7f369295 ovn-installed in OVS
Feb 23 11:14:39 compute-0 ovn_controller[97601]: 2026-02-23T11:14:39Z|00165|binding|INFO|Setting lport bad61a3a-6f81-430b-8c05-446e7f369295 up in Southbound
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.762 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:f4:6d 10.100.0.6'], port_security=['fa:16:3e:9b:f4:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7304eee8-2c91-4686-b99a-d3732086baaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=bad61a3a-6f81-430b-8c05-446e7f369295) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.765 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.765 106968 INFO neutron.agent.ovn.metadata.agent [-] Port bad61a3a-6f81-430b-8c05-446e7f369295 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.768 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.768 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:39 compute-0 systemd-udevd[215025]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.778 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[27621ad8-cedd-48b7-9b89-42cc5bcc7934]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.779 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:14:39 compute-0 systemd-machined[156970]: New machine qemu-15-instance-00000016.
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.782 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.782 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4b84dd-ff5a-45e4-b7d8-78ac4bfe96c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.783 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[7741f8b7-035d-4d5b-afb6-6b26c0af6b80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 NetworkManager[57207]: <info>  [1771845279.7887] device (tapbad61a3a-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:14:39 compute-0 NetworkManager[57207]: <info>  [1771845279.7895] device (tapbad61a3a-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.792 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bcf208-643c-45bb-8fc8-5f9c51bf1791]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000016.
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.813 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b34175c0-f515-429f-ab64-c991ffe30efd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.833 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[0d3bbc2f-c669-461a-a997-2ac3e5307702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 NetworkManager[57207]: <info>  [1771845279.8374] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.837 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0bea36-00bf-4969-a699-98028b089401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.855 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b85430-d7a7-4438-a6c6-f615712cb6a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.857 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[575b9c60-0183-4154-9eb6-f7edde37aecb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 NetworkManager[57207]: <info>  [1771845279.8743] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.876 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[47442258-688e-47de-8820-d581b33f2550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.888 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f50d0dfe-ff45-4590-9252-802b97071cfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442404, 'reachable_time': 33170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215058, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.904 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b11fc827-5ffc-4d2b-9c7e-13fec2c08d10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442404, 'tstamp': 442404}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215059, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.916 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[69aaa0c2-85ab-4a19-90c4-ad3708056210]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442404, 'reachable_time': 33170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215060, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.942 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a69a998e-156b-4ccb-b8e6-d1a5aae84aa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.985 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f42656bc-e52b-4546-a3fb-e46af20d45a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.987 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.987 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.988 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.991 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:39 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:14:39 compute-0 NetworkManager[57207]: <info>  [1771845279.9927] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.994 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:39 compute-0 ovn_controller[97601]: 2026-02-23T11:14:39Z|00166|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:14:39 compute-0 nova_compute[187639]: 2026-02-23 11:14:39.995 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:39 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.996 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:39.999 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[edffc468-6c31-4ebe-8622-9be6be82e8a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.000 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:40.000 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:14:40 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:40.001 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:14:40 compute-0 podman[215092]: 2026-02-23 11:14:40.333108252 +0000 UTC m=+0.047147670 container create ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:14:40 compute-0 systemd[1]: Started libpod-conmon-ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12.scope.
Feb 23 11:14:40 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72267b0a2107b2d21e67cc6226c19e88b9368910982d96bf1d1d17f65274659b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:14:40 compute-0 podman[215092]: 2026-02-23 11:14:40.307484559 +0000 UTC m=+0.021524067 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:14:40 compute-0 podman[215092]: 2026-02-23 11:14:40.410932826 +0000 UTC m=+0.124972284 container init ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:14:40 compute-0 podman[215092]: 2026-02-23 11:14:40.415110536 +0000 UTC m=+0.129149964 container start ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 11:14:40 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [NOTICE]   (215111) : New worker (215113) forked
Feb 23 11:14:40 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [NOTICE]   (215111) : Loading success.
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.539 187643 DEBUG nova.compute.manager [req-ca720d12-fadb-48b7-9bfc-508dab1beaba req-8efecfa0-5dbd-4aba-8ed9-86159989d275 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.540 187643 DEBUG oslo_concurrency.lockutils [req-ca720d12-fadb-48b7-9bfc-508dab1beaba req-8efecfa0-5dbd-4aba-8ed9-86159989d275 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.540 187643 DEBUG oslo_concurrency.lockutils [req-ca720d12-fadb-48b7-9bfc-508dab1beaba req-8efecfa0-5dbd-4aba-8ed9-86159989d275 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.540 187643 DEBUG oslo_concurrency.lockutils [req-ca720d12-fadb-48b7-9bfc-508dab1beaba req-8efecfa0-5dbd-4aba-8ed9-86159989d275 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.541 187643 DEBUG nova.compute.manager [req-ca720d12-fadb-48b7-9bfc-508dab1beaba req-8efecfa0-5dbd-4aba-8ed9-86159989d275 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Processing event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.953 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845280.95336, 7304eee8-2c91-4686-b99a-d3732086baaf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.954 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] VM Started (Lifecycle Event)
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.956 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.959 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.962 187643 INFO nova.virt.libvirt.driver [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Instance spawned successfully.
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.962 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.978 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.983 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.985 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.985 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.986 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.986 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.986 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:14:40 compute-0 nova_compute[187639]: 2026-02-23 11:14:40.987 187643 DEBUG nova.virt.libvirt.driver [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.008 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.008 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845280.9535666, 7304eee8-2c91-4686-b99a-d3732086baaf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.009 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] VM Paused (Lifecycle Event)
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.041 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.044 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845280.9589021, 7304eee8-2c91-4686-b99a-d3732086baaf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.044 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] VM Resumed (Lifecycle Event)
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.050 187643 INFO nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Took 7.66 seconds to spawn the instance on the hypervisor.
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.050 187643 DEBUG nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.061 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.063 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.090 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.112 187643 INFO nova.compute.manager [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Took 8.18 seconds to build instance.
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.126 187643 DEBUG oslo_concurrency.lockutils [None req-2c42d856-a9ca-4c97-a7bf-8a205cf7a858 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.500 187643 DEBUG nova.network.neutron [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Updated VIF entry in instance network info cache for port bad61a3a-6f81-430b-8c05-446e7f369295. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.501 187643 DEBUG nova.network.neutron [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Updating instance_info_cache with network_info: [{"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:14:41 compute-0 nova_compute[187639]: 2026-02-23 11:14:41.523 187643 DEBUG oslo_concurrency.lockutils [req-3cc57409-bf72-41b5-8b57-2f70d0a0b651 req-ca7ea379-54d3-4616-bed1-205cebf109d2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-7304eee8-2c91-4686-b99a-d3732086baaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:14:42 compute-0 nova_compute[187639]: 2026-02-23 11:14:42.695 187643 DEBUG nova.compute.manager [req-5ff08838-caa3-4fe1-8abd-b9377fe0e84d req-5faa3a5c-7db6-4764-968f-2136191521c2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:14:42 compute-0 nova_compute[187639]: 2026-02-23 11:14:42.696 187643 DEBUG oslo_concurrency.lockutils [req-5ff08838-caa3-4fe1-8abd-b9377fe0e84d req-5faa3a5c-7db6-4764-968f-2136191521c2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:42 compute-0 nova_compute[187639]: 2026-02-23 11:14:42.696 187643 DEBUG oslo_concurrency.lockutils [req-5ff08838-caa3-4fe1-8abd-b9377fe0e84d req-5faa3a5c-7db6-4764-968f-2136191521c2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:42 compute-0 nova_compute[187639]: 2026-02-23 11:14:42.696 187643 DEBUG oslo_concurrency.lockutils [req-5ff08838-caa3-4fe1-8abd-b9377fe0e84d req-5faa3a5c-7db6-4764-968f-2136191521c2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:42 compute-0 nova_compute[187639]: 2026-02-23 11:14:42.696 187643 DEBUG nova.compute.manager [req-5ff08838-caa3-4fe1-8abd-b9377fe0e84d req-5faa3a5c-7db6-4764-968f-2136191521c2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] No waiting events found dispatching network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:14:42 compute-0 nova_compute[187639]: 2026-02-23 11:14:42.697 187643 WARNING nova.compute.manager [req-5ff08838-caa3-4fe1-8abd-b9377fe0e84d req-5faa3a5c-7db6-4764-968f-2136191521c2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received unexpected event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 for instance with vm_state active and task_state None.
Feb 23 11:14:42 compute-0 podman[215129]: 2026-02-23 11:14:42.843423404 +0000 UTC m=+0.051216936 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, release=1770267347, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 11:14:43 compute-0 sshd-session[215152]: Invalid user admin from 143.198.30.3 port 53248
Feb 23 11:14:43 compute-0 sshd-session[215152]: Connection closed by invalid user admin 143.198.30.3 port 53248 [preauth]
Feb 23 11:14:43 compute-0 nova_compute[187639]: 2026-02-23 11:14:43.394 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:43 compute-0 nova_compute[187639]: 2026-02-23 11:14:43.655 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:47 compute-0 sshd-session[215154]: Invalid user admin from 165.227.79.48 port 47318
Feb 23 11:14:47 compute-0 sshd-session[215154]: Connection closed by invalid user admin 165.227.79.48 port 47318 [preauth]
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.172 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Creating tmpfile /var/lib/nova/instances/tmpqaislyx1 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.173 187643 DEBUG nova.compute.manager [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqaislyx1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.430 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.656 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.768 187643 DEBUG nova.compute.manager [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqaislyx1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0182ac12-7457-445c-9cb5-5701fd565168',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.798 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "refresh_cache-0182ac12-7457-445c-9cb5-5701fd565168" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.799 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquired lock "refresh_cache-0182ac12-7457-445c-9cb5-5701fd565168" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:14:48 compute-0 nova_compute[187639]: 2026-02-23 11:14:48.799 187643 DEBUG nova.network.neutron [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.066 187643 DEBUG nova.network.neutron [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Updating instance_info_cache with network_info: [{"id": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "address": "fa:16:3e:fc:57:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a8a948c-6d", "ovs_interfaceid": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.089 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Releasing lock "refresh_cache-0182ac12-7457-445c-9cb5-5701fd565168" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.091 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqaislyx1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0182ac12-7457-445c-9cb5-5701fd565168',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.091 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Creating instance directory: /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.091 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Creating disk.info with the contents: {'/var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk': 'qcow2', '/var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.092 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.092 187643 DEBUG nova.objects.instance [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0182ac12-7457-445c-9cb5-5701fd565168 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.118 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.160 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.161 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.162 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.172 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.245 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.246 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.272 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.273 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.273 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.313 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.314 187643 DEBUG nova.virt.disk.api [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Checking if we can resize image /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.314 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.355 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.356 187643 DEBUG nova.virt.disk.api [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Cannot resize image /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.356 187643 DEBUG nova.objects.instance [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lazy-loading 'migration_context' on Instance uuid 0182ac12-7457-445c-9cb5-5701fd565168 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.372 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.386 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk.config 485376" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.387 187643 DEBUG nova.virt.libvirt.volume.remotefs [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk.config to /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.387 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk.config /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.775 187643 DEBUG oslo_concurrency.processutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168/disk.config /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168" returned: 0 in 0.387s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.775 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.776 187643 DEBUG nova.virt.libvirt.vif [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:14:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-15256902',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-15256902',id=21,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:14:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-b3srcfdn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:14:25Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=0182ac12-7457-445c-9cb5-5701fd565168,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "address": "fa:16:3e:fc:57:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7a8a948c-6d", "ovs_interfaceid": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.777 187643 DEBUG nova.network.os_vif_util [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Converting VIF {"id": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "address": "fa:16:3e:fc:57:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7a8a948c-6d", "ovs_interfaceid": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.777 187643 DEBUG nova.network.os_vif_util [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:57:8a,bridge_name='br-int',has_traffic_filtering=True,id=7a8a948c-6d85-433a-ad75-a189da54f3f3,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a8a948c-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.778 187643 DEBUG os_vif [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:57:8a,bridge_name='br-int',has_traffic_filtering=True,id=7a8a948c-6d85-433a-ad75-a189da54f3f3,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a8a948c-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.778 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.778 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.779 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.781 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.781 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a8a948c-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.781 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a8a948c-6d, col_values=(('external_ids', {'iface-id': '7a8a948c-6d85-433a-ad75-a189da54f3f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:57:8a', 'vm-uuid': '0182ac12-7457-445c-9cb5-5701fd565168'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.813 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:50 compute-0 NetworkManager[57207]: <info>  [1771845290.8141] manager: (tap7a8a948c-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.817 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.822 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.823 187643 INFO os_vif [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:57:8a,bridge_name='br-int',has_traffic_filtering=True,id=7a8a948c-6d85-433a-ad75-a189da54f3f3,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a8a948c-6d')
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.823 187643 DEBUG nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 23 11:14:50 compute-0 nova_compute[187639]: 2026-02-23 11:14:50.824 187643 DEBUG nova.compute.manager [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqaislyx1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0182ac12-7457-445c-9cb5-5701fd565168',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 23 11:14:52 compute-0 nova_compute[187639]: 2026-02-23 11:14:52.191 187643 DEBUG nova.network.neutron [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Port 7a8a948c-6d85-433a-ad75-a189da54f3f3 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 23 11:14:52 compute-0 nova_compute[187639]: 2026-02-23 11:14:52.192 187643 DEBUG nova.compute.manager [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqaislyx1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0182ac12-7457-445c-9cb5-5701fd565168',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 23 11:14:52 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 23 11:14:52 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 23 11:14:52 compute-0 kernel: tap7a8a948c-6d: entered promiscuous mode
Feb 23 11:14:52 compute-0 NetworkManager[57207]: <info>  [1771845292.5018] manager: (tap7a8a948c-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Feb 23 11:14:52 compute-0 ovn_controller[97601]: 2026-02-23T11:14:52Z|00167|binding|INFO|Claiming lport 7a8a948c-6d85-433a-ad75-a189da54f3f3 for this additional chassis.
Feb 23 11:14:52 compute-0 ovn_controller[97601]: 2026-02-23T11:14:52Z|00168|binding|INFO|7a8a948c-6d85-433a-ad75-a189da54f3f3: Claiming fa:16:3e:fc:57:8a 10.100.0.9
Feb 23 11:14:52 compute-0 nova_compute[187639]: 2026-02-23 11:14:52.503 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:52 compute-0 ovn_controller[97601]: 2026-02-23T11:14:52Z|00169|binding|INFO|Setting lport 7a8a948c-6d85-433a-ad75-a189da54f3f3 ovn-installed in OVS
Feb 23 11:14:52 compute-0 nova_compute[187639]: 2026-02-23 11:14:52.509 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:52 compute-0 nova_compute[187639]: 2026-02-23 11:14:52.512 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:52 compute-0 systemd-machined[156970]: New machine qemu-16-instance-00000015.
Feb 23 11:14:52 compute-0 systemd-udevd[215233]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:14:52 compute-0 NetworkManager[57207]: <info>  [1771845292.5428] device (tap7a8a948c-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:14:52 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Feb 23 11:14:52 compute-0 NetworkManager[57207]: <info>  [1771845292.5433] device (tap7a8a948c-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:14:53 compute-0 nova_compute[187639]: 2026-02-23 11:14:53.472 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:53 compute-0 nova_compute[187639]: 2026-02-23 11:14:53.633 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845293.6328168, 0182ac12-7457-445c-9cb5-5701fd565168 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:14:53 compute-0 nova_compute[187639]: 2026-02-23 11:14:53.633 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] VM Started (Lifecycle Event)
Feb 23 11:14:53 compute-0 nova_compute[187639]: 2026-02-23 11:14:53.653 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:53 compute-0 ovn_controller[97601]: 2026-02-23T11:14:53Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:f4:6d 10.100.0.6
Feb 23 11:14:53 compute-0 ovn_controller[97601]: 2026-02-23T11:14:53Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:f4:6d 10.100.0.6
Feb 23 11:14:54 compute-0 nova_compute[187639]: 2026-02-23 11:14:54.264 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845294.2639017, 0182ac12-7457-445c-9cb5-5701fd565168 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:14:54 compute-0 nova_compute[187639]: 2026-02-23 11:14:54.264 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] VM Resumed (Lifecycle Event)
Feb 23 11:14:54 compute-0 nova_compute[187639]: 2026-02-23 11:14:54.284 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:54 compute-0 nova_compute[187639]: 2026-02-23 11:14:54.287 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:14:54 compute-0 nova_compute[187639]: 2026-02-23 11:14:54.306 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 23 11:14:55 compute-0 ovn_controller[97601]: 2026-02-23T11:14:55Z|00170|binding|INFO|Claiming lport 7a8a948c-6d85-433a-ad75-a189da54f3f3 for this chassis.
Feb 23 11:14:55 compute-0 ovn_controller[97601]: 2026-02-23T11:14:55Z|00171|binding|INFO|7a8a948c-6d85-433a-ad75-a189da54f3f3: Claiming fa:16:3e:fc:57:8a 10.100.0.9
Feb 23 11:14:55 compute-0 ovn_controller[97601]: 2026-02-23T11:14:55Z|00172|binding|INFO|Setting lport 7a8a948c-6d85-433a-ad75-a189da54f3f3 up in Southbound
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.370 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:57:8a 10.100.0.9'], port_security=['fa:16:3e:fc:57:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '0182ac12-7457-445c-9cb5-5701fd565168', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '11', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=7a8a948c-6d85-433a-ad75-a189da54f3f3) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.371 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 7a8a948c-6d85-433a-ad75-a189da54f3f3 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.373 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.386 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6399e345-5e51-4b9f-ba2b-5d37e11f9c5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.415 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[050412ed-764b-42f4-81cc-48c29983668f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.419 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[af602be1-0889-44d2-adb3-58de4fcd0e46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.441 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[8287ccae-bfa4-4c2f-95a8-851363828dc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.456 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[56c7e872-6452-42d0-af7a-84610ba326bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 742, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 742, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442404, 'reachable_time': 33170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215268, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.467 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6e799892-3497-4dfc-b782-85034a63564e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442412, 'tstamp': 442412}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215269, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442414, 'tstamp': 442414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215269, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.468 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:55 compute-0 nova_compute[187639]: 2026-02-23 11:14:55.470 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:55 compute-0 nova_compute[187639]: 2026-02-23 11:14:55.470 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.471 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.471 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.471 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:14:55 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:14:55.472 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:14:55 compute-0 nova_compute[187639]: 2026-02-23 11:14:55.690 187643 INFO nova.compute.manager [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Post operation of migration started
Feb 23 11:14:55 compute-0 nova_compute[187639]: 2026-02-23 11:14:55.813 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:56 compute-0 nova_compute[187639]: 2026-02-23 11:14:56.114 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "refresh_cache-0182ac12-7457-445c-9cb5-5701fd565168" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:14:56 compute-0 nova_compute[187639]: 2026-02-23 11:14:56.115 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquired lock "refresh_cache-0182ac12-7457-445c-9cb5-5701fd565168" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:14:56 compute-0 nova_compute[187639]: 2026-02-23 11:14:56.115 187643 DEBUG nova.network.neutron [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:14:56 compute-0 podman[215270]: 2026-02-23 11:14:56.861867909 +0000 UTC m=+0.058535068 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:14:57 compute-0 nova_compute[187639]: 2026-02-23 11:14:57.115 187643 DEBUG nova.network.neutron [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Updating instance_info_cache with network_info: [{"id": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "address": "fa:16:3e:fc:57:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a8a948c-6d", "ovs_interfaceid": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:14:57 compute-0 nova_compute[187639]: 2026-02-23 11:14:57.131 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Releasing lock "refresh_cache-0182ac12-7457-445c-9cb5-5701fd565168" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:14:57 compute-0 nova_compute[187639]: 2026-02-23 11:14:57.143 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:14:57 compute-0 nova_compute[187639]: 2026-02-23 11:14:57.144 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:14:57 compute-0 nova_compute[187639]: 2026-02-23 11:14:57.144 187643 DEBUG oslo_concurrency.lockutils [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:14:57 compute-0 nova_compute[187639]: 2026-02-23 11:14:57.150 187643 INFO nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 23 11:14:57 compute-0 virtqemud[186733]: Domain id=16 name='instance-00000015' uuid=0182ac12-7457-445c-9cb5-5701fd565168 is tainted: custom-monitor
Feb 23 11:14:58 compute-0 nova_compute[187639]: 2026-02-23 11:14:58.157 187643 INFO nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 23 11:14:58 compute-0 nova_compute[187639]: 2026-02-23 11:14:58.514 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:14:59 compute-0 nova_compute[187639]: 2026-02-23 11:14:59.163 187643 INFO nova.virt.libvirt.driver [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 23 11:14:59 compute-0 nova_compute[187639]: 2026-02-23 11:14:59.166 187643 DEBUG nova.compute.manager [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:14:59 compute-0 nova_compute[187639]: 2026-02-23 11:14:59.188 187643 DEBUG nova.objects.instance [None req-8b01e664-06c0-4b6c-b21f-4dd6acd805fa d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 23 11:14:59 compute-0 podman[197002]: time="2026-02-23T11:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:14:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:14:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Feb 23 11:15:00 compute-0 nova_compute[187639]: 2026-02-23 11:15:00.816 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:01 compute-0 openstack_network_exporter[199919]: ERROR   11:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:15:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:15:01 compute-0 openstack_network_exporter[199919]: ERROR   11:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:15:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:15:01 compute-0 podman[215295]: 2026-02-23 11:15:01.840054352 +0000 UTC m=+0.042352984 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:15:03 compute-0 nova_compute[187639]: 2026-02-23 11:15:03.547 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:03 compute-0 nova_compute[187639]: 2026-02-23 11:15:03.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:05 compute-0 nova_compute[187639]: 2026-02-23 11:15:05.818 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.023 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.024 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.024 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.024 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.024 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.025 187643 INFO nova.compute.manager [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Terminating instance
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.026 187643 DEBUG nova.compute.manager [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:15:06 compute-0 kernel: tapbad61a3a-6f (unregistering): left promiscuous mode
Feb 23 11:15:06 compute-0 NetworkManager[57207]: <info>  [1771845306.0552] device (tapbad61a3a-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:15:06 compute-0 ovn_controller[97601]: 2026-02-23T11:15:06Z|00173|binding|INFO|Releasing lport bad61a3a-6f81-430b-8c05-446e7f369295 from this chassis (sb_readonly=0)
Feb 23 11:15:06 compute-0 ovn_controller[97601]: 2026-02-23T11:15:06Z|00174|binding|INFO|Setting lport bad61a3a-6f81-430b-8c05-446e7f369295 down in Southbound
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.058 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 ovn_controller[97601]: 2026-02-23T11:15:06Z|00175|binding|INFO|Removing iface tapbad61a3a-6f ovn-installed in OVS
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.060 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.066 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:f4:6d 10.100.0.6'], port_security=['fa:16:3e:9b:f4:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7304eee8-2c91-4686-b99a-d3732086baaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=bad61a3a-6f81-430b-8c05-446e7f369295) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.067 106968 INFO neutron.agent.ovn.metadata.agent [-] Port bad61a3a-6f81-430b-8c05-446e7f369295 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.068 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.070 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.079 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0d89ff-0220-4d04-987c-21f7729d5582]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:06 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Deactivated successfully.
Feb 23 11:15:06 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Consumed 13.083s CPU time.
Feb 23 11:15:06 compute-0 systemd-machined[156970]: Machine qemu-15-instance-00000016 terminated.
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.095 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[d449cf0c-9208-42dd-beae-0ded21ffdc1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.097 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d2bef7-f6cc-48f8-b3af-954b7590d94c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.117 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a119fb-fb01-4af9-9ce6-a1905dd0df78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.127 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[019e41a3-9312-4417-9bcd-64f1125b0731]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442404, 'reachable_time': 33170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215329, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.135 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[42826063-2096-4f93-91ac-86de519d8c11]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442412, 'tstamp': 442412}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215330, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b12da8d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442414, 'tstamp': 442414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215330, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.137 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.138 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.141 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.141 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.141 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.142 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:15:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:06.142 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.208 187643 DEBUG nova.compute.manager [req-ca8aee54-7b30-41ed-b988-c3a180c3d388 req-bf57f935-a91a-40a6-8a23-f2f4393e9260 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-vif-unplugged-bad61a3a-6f81-430b-8c05-446e7f369295 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.208 187643 DEBUG oslo_concurrency.lockutils [req-ca8aee54-7b30-41ed-b988-c3a180c3d388 req-bf57f935-a91a-40a6-8a23-f2f4393e9260 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.209 187643 DEBUG oslo_concurrency.lockutils [req-ca8aee54-7b30-41ed-b988-c3a180c3d388 req-bf57f935-a91a-40a6-8a23-f2f4393e9260 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.209 187643 DEBUG oslo_concurrency.lockutils [req-ca8aee54-7b30-41ed-b988-c3a180c3d388 req-bf57f935-a91a-40a6-8a23-f2f4393e9260 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.209 187643 DEBUG nova.compute.manager [req-ca8aee54-7b30-41ed-b988-c3a180c3d388 req-bf57f935-a91a-40a6-8a23-f2f4393e9260 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] No waiting events found dispatching network-vif-unplugged-bad61a3a-6f81-430b-8c05-446e7f369295 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.209 187643 DEBUG nova.compute.manager [req-ca8aee54-7b30-41ed-b988-c3a180c3d388 req-bf57f935-a91a-40a6-8a23-f2f4393e9260 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-vif-unplugged-bad61a3a-6f81-430b-8c05-446e7f369295 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.269 187643 INFO nova.virt.libvirt.driver [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Instance destroyed successfully.
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.269 187643 DEBUG nova.objects.instance [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid 7304eee8-2c91-4686-b99a-d3732086baaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.280 187643 DEBUG nova.virt.libvirt.vif [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:14:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1703562665',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1703562665',id=22,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:14:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-ah98ec5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:14:41Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=7304eee8-2c91-4686-b99a-d3732086baaf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.280 187643 DEBUG nova.network.os_vif_util [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "bad61a3a-6f81-430b-8c05-446e7f369295", "address": "fa:16:3e:9b:f4:6d", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad61a3a-6f", "ovs_interfaceid": "bad61a3a-6f81-430b-8c05-446e7f369295", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.281 187643 DEBUG nova.network.os_vif_util [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.281 187643 DEBUG os_vif [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.282 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.282 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbad61a3a-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.283 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.284 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.286 187643 INFO os_vif [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f4:6d,bridge_name='br-int',has_traffic_filtering=True,id=bad61a3a-6f81-430b-8c05-446e7f369295,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad61a3a-6f')
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.286 187643 INFO nova.virt.libvirt.driver [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Deleting instance files /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf_del
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.287 187643 INFO nova.virt.libvirt.driver [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Deletion of /var/lib/nova/instances/7304eee8-2c91-4686-b99a-d3732086baaf_del complete
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.326 187643 INFO nova.compute.manager [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Took 0.30 seconds to destroy the instance on the hypervisor.
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.327 187643 DEBUG oslo.service.loopingcall [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.327 187643 DEBUG nova.compute.manager [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.327 187643 DEBUG nova.network.neutron [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.883 187643 DEBUG nova.network.neutron [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.896 187643 INFO nova.compute.manager [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Took 0.57 seconds to deallocate network for instance.
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.956 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:06 compute-0 nova_compute[187639]: 2026-02-23 11:15:06.956 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.024 187643 DEBUG nova.compute.provider_tree [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.045 187643 DEBUG nova.scheduler.client.report [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.068 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.086 187643 INFO nova.scheduler.client.report [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance 7304eee8-2c91-4686-b99a-d3732086baaf
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.141 187643 DEBUG oslo_concurrency.lockutils [None req-65a4636f-3875-44fa-a868-5a499925208e 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.939 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "0182ac12-7457-445c-9cb5-5701fd565168" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.940 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.940 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "0182ac12-7457-445c-9cb5-5701fd565168-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.940 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.940 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.941 187643 INFO nova.compute.manager [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Terminating instance
Feb 23 11:15:07 compute-0 nova_compute[187639]: 2026-02-23 11:15:07.942 187643 DEBUG nova.compute.manager [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:15:07 compute-0 kernel: tap7a8a948c-6d (unregistering): left promiscuous mode
Feb 23 11:15:07 compute-0 NetworkManager[57207]: <info>  [1771845307.9729] device (tap7a8a948c-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:15:08 compute-0 ovn_controller[97601]: 2026-02-23T11:15:08Z|00176|binding|INFO|Releasing lport 7a8a948c-6d85-433a-ad75-a189da54f3f3 from this chassis (sb_readonly=0)
Feb 23 11:15:08 compute-0 ovn_controller[97601]: 2026-02-23T11:15:08Z|00177|binding|INFO|Setting lport 7a8a948c-6d85-433a-ad75-a189da54f3f3 down in Southbound
Feb 23 11:15:08 compute-0 ovn_controller[97601]: 2026-02-23T11:15:08Z|00178|binding|INFO|Removing iface tap7a8a948c-6d ovn-installed in OVS
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.002 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.005 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.008 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:57:8a 10.100.0.9'], port_security=['fa:16:3e:fc:57:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '0182ac12-7457-445c-9cb5-5701fd565168', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '13', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=7a8a948c-6d85-433a-ad75-a189da54f3f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.010 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 7a8a948c-6d85-433a-ad75-a189da54f3f3 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.013 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.014 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[839a6079-b822-42bc-81d6-4e18be3ed499]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.015 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:15:08 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Feb 23 11:15:08 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 1.876s CPU time.
Feb 23 11:15:08 compute-0 systemd-machined[156970]: Machine qemu-16-instance-00000015 terminated.
Feb 23 11:15:08 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [NOTICE]   (215111) : haproxy version is 2.8.14-c23fe91
Feb 23 11:15:08 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [NOTICE]   (215111) : path to executable is /usr/sbin/haproxy
Feb 23 11:15:08 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [WARNING]  (215111) : Exiting Master process...
Feb 23 11:15:08 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [ALERT]    (215111) : Current worker (215113) exited with code 143 (Terminated)
Feb 23 11:15:08 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215107]: [WARNING]  (215111) : All workers exited. Exiting... (0)
Feb 23 11:15:08 compute-0 systemd[1]: libpod-ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12.scope: Deactivated successfully.
Feb 23 11:15:08 compute-0 podman[215372]: 2026-02-23 11:15:08.11649881 +0000 UTC m=+0.042295702 container died ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 11:15:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12-userdata-shm.mount: Deactivated successfully.
Feb 23 11:15:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-72267b0a2107b2d21e67cc6226c19e88b9368910982d96bf1d1d17f65274659b-merged.mount: Deactivated successfully.
Feb 23 11:15:08 compute-0 podman[215372]: 2026-02-23 11:15:08.150477553 +0000 UTC m=+0.076274445 container cleanup ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:15:08 compute-0 systemd[1]: libpod-conmon-ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12.scope: Deactivated successfully.
Feb 23 11:15:08 compute-0 NetworkManager[57207]: <info>  [1771845308.1600] manager: (tap7a8a948c-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.161 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.166 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.196 187643 INFO nova.virt.libvirt.driver [-] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Instance destroyed successfully.
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.196 187643 DEBUG nova.objects.instance [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'resources' on Instance uuid 0182ac12-7457-445c-9cb5-5701fd565168 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.212 187643 DEBUG nova.virt.libvirt.vif [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-23T11:14:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-15256902',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-15256902',id=21,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:14:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-b3srcfdn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',own
er_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:14:59Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=0182ac12-7457-445c-9cb5-5701fd565168,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "address": "fa:16:3e:fc:57:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a8a948c-6d", "ovs_interfaceid": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.213 187643 DEBUG nova.network.os_vif_util [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "address": "fa:16:3e:fc:57:8a", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a8a948c-6d", "ovs_interfaceid": "7a8a948c-6d85-433a-ad75-a189da54f3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.213 187643 DEBUG nova.network.os_vif_util [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:57:8a,bridge_name='br-int',has_traffic_filtering=True,id=7a8a948c-6d85-433a-ad75-a189da54f3f3,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a8a948c-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.213 187643 DEBUG os_vif [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:57:8a,bridge_name='br-int',has_traffic_filtering=True,id=7a8a948c-6d85-433a-ad75-a189da54f3f3,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a8a948c-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.215 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.215 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a8a948c-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:15:08 compute-0 podman[215405]: 2026-02-23 11:15:08.215259505 +0000 UTC m=+0.045106166 container remove ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.216 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.218 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.221 187643 INFO os_vif [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:57:8a,bridge_name='br-int',has_traffic_filtering=True,id=7a8a948c-6d85-433a-ad75-a189da54f3f3,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a8a948c-6d')
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.221 187643 INFO nova.virt.libvirt.driver [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Deleting instance files /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168_del
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.220 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f64699-ce76-484b-934b-b88970f44f54]: (4, ('Mon Feb 23 11:15:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12)\nac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12\nMon Feb 23 11:15:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (ac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12)\nac299f717c20f0da47655b7a8cd9fd27b83231386943bcd4bc56384e49c40f12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.222 187643 INFO nova.virt.libvirt.driver [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Deletion of /var/lib/nova/instances/0182ac12-7457-445c-9cb5-5701fd565168_del complete
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.223 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0a82e8a0-7ef3-4f69-8b36-d0769b594f9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.224 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.225 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.227 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.231 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.233 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[cb9b86a7-c766-44c3-ad55-eb0dc3b985c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.245 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5218e3-5e84-476d-9f72-cef42fd9b6e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.246 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f096c3-d694-4d29-87ba-38a82f10f0da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.261 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1e5fac-554e-491a-a333-3b9d57b735ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442399, 'reachable_time': 43450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215432, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.264 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:15:08 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:08.264 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[cc73adcc-731c-40c0-9658-99ba74d1bea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:15:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.272 187643 INFO nova.compute.manager [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.272 187643 DEBUG oslo.service.loopingcall [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.273 187643 DEBUG nova.compute.manager [-] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.273 187643 DEBUG nova.network.neutron [-] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.331 187643 DEBUG nova.compute.manager [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.332 187643 DEBUG oslo_concurrency.lockutils [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.332 187643 DEBUG oslo_concurrency.lockutils [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.332 187643 DEBUG oslo_concurrency.lockutils [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "7304eee8-2c91-4686-b99a-d3732086baaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.332 187643 DEBUG nova.compute.manager [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] No waiting events found dispatching network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.333 187643 WARNING nova.compute.manager [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received unexpected event network-vif-plugged-bad61a3a-6f81-430b-8c05-446e7f369295 for instance with vm_state deleted and task_state None.
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.333 187643 DEBUG nova.compute.manager [req-59e255d5-1410-43ad-a70b-779c489b9695 req-6c06fcac-98c7-4085-8029-a2f5d6c71dbe 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Received event network-vif-deleted-bad61a3a-6f81-430b-8c05-446e7f369295 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.550 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.709 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.918 187643 DEBUG nova.network.neutron [-] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.935 187643 INFO nova.compute.manager [-] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Took 0.66 seconds to deallocate network for instance.
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.988 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.988 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:08 compute-0 nova_compute[187639]: 2026-02-23 11:15:08.994 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:09 compute-0 nova_compute[187639]: 2026-02-23 11:15:09.020 187643 INFO nova.scheduler.client.report [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Deleted allocations for instance 0182ac12-7457-445c-9cb5-5701fd565168
Feb 23 11:15:09 compute-0 nova_compute[187639]: 2026-02-23 11:15:09.107 187643 DEBUG oslo_concurrency.lockutils [None req-bce1eb26-0807-4554-b06b-52019e4b5351 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:09 compute-0 nova_compute[187639]: 2026-02-23 11:15:09.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:09 compute-0 podman[215433]: 2026-02-23 11:15:09.892283405 +0000 UTC m=+0.088528307 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.465 187643 DEBUG nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Received event network-vif-unplugged-7a8a948c-6d85-433a-ad75-a189da54f3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.465 187643 DEBUG oslo_concurrency.lockutils [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0182ac12-7457-445c-9cb5-5701fd565168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.466 187643 DEBUG oslo_concurrency.lockutils [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.466 187643 DEBUG oslo_concurrency.lockutils [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.466 187643 DEBUG nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] No waiting events found dispatching network-vif-unplugged-7a8a948c-6d85-433a-ad75-a189da54f3f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.467 187643 WARNING nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Received unexpected event network-vif-unplugged-7a8a948c-6d85-433a-ad75-a189da54f3f3 for instance with vm_state deleted and task_state None.
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.467 187643 DEBUG nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Received event network-vif-plugged-7a8a948c-6d85-433a-ad75-a189da54f3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.467 187643 DEBUG oslo_concurrency.lockutils [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "0182ac12-7457-445c-9cb5-5701fd565168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.467 187643 DEBUG oslo_concurrency.lockutils [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.468 187643 DEBUG oslo_concurrency.lockutils [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "0182ac12-7457-445c-9cb5-5701fd565168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.468 187643 DEBUG nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] No waiting events found dispatching network-vif-plugged-7a8a948c-6d85-433a-ad75-a189da54f3f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.468 187643 WARNING nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Received unexpected event network-vif-plugged-7a8a948c-6d85-433a-ad75-a189da54f3f3 for instance with vm_state deleted and task_state None.
Feb 23 11:15:10 compute-0 nova_compute[187639]: 2026-02-23 11:15:10.468 187643 DEBUG nova.compute.manager [req-5e8642a4-780f-4fc6-8a54-7ff968c16add req-950848ca-1208-43e1-a850-546c8248cfcd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Received event network-vif-deleted-7a8a948c-6d85-433a-ad75-a189da54f3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:15:11 compute-0 nova_compute[187639]: 2026-02-23 11:15:11.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:12.661 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:12.661 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:15:12.661 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.219 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.575 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.726 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.727 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.727 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.727 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:15:13 compute-0 podman[215461]: 2026-02-23 11:15:13.854260357 +0000 UTC m=+0.087153181 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347)
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.919 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.921 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5777MB free_disk=73.20487594604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.921 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.922 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.984 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:15:13 compute-0 nova_compute[187639]: 2026-02-23 11:15:13.984 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:15:14 compute-0 nova_compute[187639]: 2026-02-23 11:15:14.007 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:15:14 compute-0 nova_compute[187639]: 2026-02-23 11:15:14.026 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:15:14 compute-0 nova_compute[187639]: 2026-02-23 11:15:14.058 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:15:14 compute-0 nova_compute[187639]: 2026-02-23 11:15:14.058 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:15:15 compute-0 sshd-session[215482]: Invalid user admin from 143.198.30.3 port 46442
Feb 23 11:15:15 compute-0 sshd-session[215482]: Connection closed by invalid user admin 143.198.30.3 port 46442 [preauth]
Feb 23 11:15:17 compute-0 nova_compute[187639]: 2026-02-23 11:15:17.060 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:15:18 compute-0 nova_compute[187639]: 2026-02-23 11:15:18.228 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:18 compute-0 nova_compute[187639]: 2026-02-23 11:15:18.614 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:21 compute-0 nova_compute[187639]: 2026-02-23 11:15:21.269 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845306.2678692, 7304eee8-2c91-4686-b99a-d3732086baaf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:15:21 compute-0 nova_compute[187639]: 2026-02-23 11:15:21.269 187643 INFO nova.compute.manager [-] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] VM Stopped (Lifecycle Event)
Feb 23 11:15:21 compute-0 nova_compute[187639]: 2026-02-23 11:15:21.304 187643 DEBUG nova.compute.manager [None req-33b8cdf3-10cc-4040-acc1-9a9ba7e9f780 - - - - - -] [instance: 7304eee8-2c91-4686-b99a-d3732086baaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:15:23 compute-0 nova_compute[187639]: 2026-02-23 11:15:23.196 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845308.1932104, 0182ac12-7457-445c-9cb5-5701fd565168 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:15:23 compute-0 nova_compute[187639]: 2026-02-23 11:15:23.196 187643 INFO nova.compute.manager [-] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] VM Stopped (Lifecycle Event)
Feb 23 11:15:23 compute-0 nova_compute[187639]: 2026-02-23 11:15:23.215 187643 DEBUG nova.compute.manager [None req-4451b31a-cf79-4389-9d35-e2682acb0ec9 - - - - - -] [instance: 0182ac12-7457-445c-9cb5-5701fd565168] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:15:23 compute-0 nova_compute[187639]: 2026-02-23 11:15:23.231 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:23 compute-0 nova_compute[187639]: 2026-02-23 11:15:23.664 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:27 compute-0 podman[215484]: 2026-02-23 11:15:27.835080633 +0000 UTC m=+0.041611325 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:15:28 compute-0 nova_compute[187639]: 2026-02-23 11:15:28.234 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:28 compute-0 nova_compute[187639]: 2026-02-23 11:15:28.708 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:29 compute-0 podman[197002]: time="2026-02-23T11:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:15:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:15:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 23 11:15:31 compute-0 openstack_network_exporter[199919]: ERROR   11:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:15:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:15:31 compute-0 openstack_network_exporter[199919]: ERROR   11:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:15:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:15:32 compute-0 sshd-session[215508]: Invalid user admin from 165.227.79.48 port 47164
Feb 23 11:15:32 compute-0 sshd-session[215508]: Connection closed by invalid user admin 165.227.79.48 port 47164 [preauth]
Feb 23 11:15:32 compute-0 podman[215510]: 2026-02-23 11:15:32.06180993 +0000 UTC m=+0.044097379 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 11:15:33 compute-0 nova_compute[187639]: 2026-02-23 11:15:33.237 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:33 compute-0 nova_compute[187639]: 2026-02-23 11:15:33.754 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:38 compute-0 nova_compute[187639]: 2026-02-23 11:15:38.284 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:38 compute-0 nova_compute[187639]: 2026-02-23 11:15:38.756 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:38 compute-0 ovn_controller[97601]: 2026-02-23T11:15:38Z|00179|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 23 11:15:40 compute-0 podman[215529]: 2026-02-23 11:15:40.983494518 +0000 UTC m=+0.176695433 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, 
org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 11:15:43 compute-0 nova_compute[187639]: 2026-02-23 11:15:43.287 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:43 compute-0 nova_compute[187639]: 2026-02-23 11:15:43.793 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:44 compute-0 podman[215556]: 2026-02-23 11:15:44.894773035 +0000 UTC m=+0.095819380 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7)
Feb 23 11:15:46 compute-0 sshd-session[215578]: Invalid user admin from 143.198.30.3 port 59402
Feb 23 11:15:46 compute-0 sshd-session[215578]: Connection closed by invalid user admin 143.198.30.3 port 59402 [preauth]
Feb 23 11:15:48 compute-0 nova_compute[187639]: 2026-02-23 11:15:48.290 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:48 compute-0 nova_compute[187639]: 2026-02-23 11:15:48.858 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:53 compute-0 nova_compute[187639]: 2026-02-23 11:15:53.295 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:53 compute-0 nova_compute[187639]: 2026-02-23 11:15:53.900 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:55 compute-0 sshd-session[215580]: Invalid user admin from 80.94.95.116 port 59792
Feb 23 11:15:55 compute-0 sshd-session[215580]: Connection closed by invalid user admin 80.94.95.116 port 59792 [preauth]
Feb 23 11:15:58 compute-0 nova_compute[187639]: 2026-02-23 11:15:58.299 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:58 compute-0 podman[215583]: 2026-02-23 11:15:58.83879745 +0000 UTC m=+0.044463659 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:15:58 compute-0 nova_compute[187639]: 2026-02-23 11:15:58.901 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:15:59 compute-0 podman[197002]: time="2026-02-23T11:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:15:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:15:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 23 11:16:01 compute-0 openstack_network_exporter[199919]: ERROR   11:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:16:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:16:01 compute-0 openstack_network_exporter[199919]: ERROR   11:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:16:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:16:02 compute-0 podman[215607]: 2026-02-23 11:16:02.891833321 +0000 UTC m=+0.087928412 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 11:16:03 compute-0 nova_compute[187639]: 2026-02-23 11:16:03.302 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:03 compute-0 nova_compute[187639]: 2026-02-23 11:16:03.903 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:05 compute-0 nova_compute[187639]: 2026-02-23 11:16:05.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:07 compute-0 nova_compute[187639]: 2026-02-23 11:16:07.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:08 compute-0 nova_compute[187639]: 2026-02-23 11:16:08.305 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:08 compute-0 nova_compute[187639]: 2026-02-23 11:16:08.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:08 compute-0 nova_compute[187639]: 2026-02-23 11:16:08.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:08 compute-0 nova_compute[187639]: 2026-02-23 11:16:08.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:16:08 compute-0 nova_compute[187639]: 2026-02-23 11:16:08.960 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:09 compute-0 nova_compute[187639]: 2026-02-23 11:16:09.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:09 compute-0 nova_compute[187639]: 2026-02-23 11:16:09.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:16:09 compute-0 nova_compute[187639]: 2026-02-23 11:16:09.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:16:09 compute-0 nova_compute[187639]: 2026-02-23 11:16:09.712 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:16:10 compute-0 nova_compute[187639]: 2026-02-23 11:16:10.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:11 compute-0 nova_compute[187639]: 2026-02-23 11:16:11.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:11 compute-0 podman[215627]: 2026-02-23 11:16:11.899407065 +0000 UTC m=+0.102484295 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:16:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:12.662 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:12.662 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:12.663 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:13 compute-0 nova_compute[187639]: 2026-02-23 11:16:13.308 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:13 compute-0 nova_compute[187639]: 2026-02-23 11:16:13.961 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.719 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.720 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.720 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.720 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.830 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.831 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5820MB free_disk=73.20487594604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.831 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.831 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:15 compute-0 podman[215653]: 2026-02-23 11:16:15.845000181 +0000 UTC m=+0.054146394 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64)
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.887 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.888 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.903 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.927 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.927 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.945 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.970 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 11:16:15 compute-0 nova_compute[187639]: 2026-02-23 11:16:15.998 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:16:16 compute-0 nova_compute[187639]: 2026-02-23 11:16:16.011 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:16:16 compute-0 nova_compute[187639]: 2026-02-23 11:16:16.013 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:16:16 compute-0 nova_compute[187639]: 2026-02-23 11:16:16.013 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:16 compute-0 sshd-session[215674]: Invalid user admin from 165.227.79.48 port 42546
Feb 23 11:16:16 compute-0 sshd-session[215674]: Connection closed by invalid user admin 165.227.79.48 port 42546 [preauth]
Feb 23 11:16:16 compute-0 sshd-session[215676]: Invalid user admin from 143.198.30.3 port 39230
Feb 23 11:16:16 compute-0 sshd-session[215676]: Connection closed by invalid user admin 143.198.30.3 port 39230 [preauth]
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.817 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.817 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.834 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.924 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.925 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.933 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:16:17 compute-0 nova_compute[187639]: 2026-02-23 11:16:17.933 187643 INFO nova.compute.claims [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.013 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.054 187643 DEBUG nova.compute.provider_tree [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.069 187643 DEBUG nova.scheduler.client.report [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.089 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.090 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.138 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.139 187643 DEBUG nova.network.neutron [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.156 187643 INFO nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.173 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.263 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.265 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.265 187643 INFO nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Creating image(s)
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.266 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "/var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.266 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.267 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "/var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.283 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.311 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.361 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.361 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.362 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.387 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.427 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.428 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.442 187643 DEBUG nova.policy [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48814d91aad6418f9d55fc9967ed0087', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.478 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.479 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.480 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.519 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.520 187643 DEBUG nova.virt.disk.api [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Checking if we can resize image /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.521 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.594 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.595 187643 DEBUG nova.virt.disk.api [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Cannot resize image /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.595 187643 DEBUG nova.objects.instance [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'migration_context' on Instance uuid 42b59c9e-2075-4426-aca3-d27c2ca4a97e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.618 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.618 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Ensure instance console log exists: /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.619 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.619 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.619 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:18.901 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.902 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:18.902 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:16:18 compute-0 nova_compute[187639]: 2026-02-23 11:16:18.962 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:19 compute-0 nova_compute[187639]: 2026-02-23 11:16:19.087 187643 DEBUG nova.network.neutron [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Successfully created port: 14d62fb0-c869-4b3c-8914-c1863a5aafe0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.884 187643 DEBUG nova.network.neutron [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Successfully updated port: 14d62fb0-c869-4b3c-8914-c1863a5aafe0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.900 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.900 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquired lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.900 187643 DEBUG nova.network.neutron [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.968 187643 DEBUG nova.compute.manager [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-changed-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.969 187643 DEBUG nova.compute.manager [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Refreshing instance network info cache due to event network-changed-14d62fb0-c869-4b3c-8914-c1863a5aafe0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:16:21 compute-0 nova_compute[187639]: 2026-02-23 11:16:21.969 187643 DEBUG oslo_concurrency.lockutils [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:16:22 compute-0 nova_compute[187639]: 2026-02-23 11:16:22.087 187643 DEBUG nova.network.neutron [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:16:23 compute-0 nova_compute[187639]: 2026-02-23 11:16:23.313 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:23 compute-0 nova_compute[187639]: 2026-02-23 11:16:23.965 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.402 187643 DEBUG nova.network.neutron [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updating instance_info_cache with network_info: [{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.437 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Releasing lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.437 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Instance network_info: |[{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.438 187643 DEBUG oslo_concurrency.lockutils [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.438 187643 DEBUG nova.network.neutron [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Refreshing network info cache for port 14d62fb0-c869-4b3c-8914-c1863a5aafe0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.441 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Start _get_guest_xml network_info=[{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.445 187643 WARNING nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.450 187643 DEBUG nova.virt.libvirt.host [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.451 187643 DEBUG nova.virt.libvirt.host [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.461 187643 DEBUG nova.virt.libvirt.host [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.462 187643 DEBUG nova.virt.libvirt.host [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.463 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.463 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.464 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.464 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.464 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.464 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.465 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.465 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.465 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.466 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.466 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.466 187643 DEBUG nova.virt.hardware [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.469 187643 DEBUG nova.virt.libvirt.vif [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:16:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-10307845',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-10307845',id=23,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-rnnmsydt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,task_state
='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:16:18Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=42b59c9e-2075-4426-aca3-d27c2ca4a97e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.470 187643 DEBUG nova.network.os_vif_util [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.470 187643 DEBUG nova.network.os_vif_util [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.471 187643 DEBUG nova.objects.instance [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 42b59c9e-2075-4426-aca3-d27c2ca4a97e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.484 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <uuid>42b59c9e-2075-4426-aca3-d27c2ca4a97e</uuid>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <name>instance-00000017</name>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteStrategies-server-10307845</nova:name>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:16:24</nova:creationTime>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:user uuid="48814d91aad6418f9d55fc9967ed0087">tempest-TestExecuteStrategies-126537390-project-member</nova:user>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:project uuid="5dfbb0ac693b4065ada17052ebb303dd">tempest-TestExecuteStrategies-126537390</nova:project>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         <nova:port uuid="14d62fb0-c869-4b3c-8914-c1863a5aafe0">
Feb 23 11:16:24 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <system>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <entry name="serial">42b59c9e-2075-4426-aca3-d27c2ca4a97e</entry>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <entry name="uuid">42b59c9e-2075-4426-aca3-d27c2ca4a97e</entry>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </system>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <os>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </os>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <features>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </features>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.config"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:5f:3a:4e"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <target dev="tap14d62fb0-c8"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/console.log" append="off"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <video>
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </video>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:16:24 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:16:24 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:16:24 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:16:24 compute-0 nova_compute[187639]: </domain>
Feb 23 11:16:24 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.485 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Preparing to wait for external event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.485 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.486 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.486 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.486 187643 DEBUG nova.virt.libvirt.vif [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:16:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-10307845',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-10307845',id=23,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-rnnmsydt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=TagList,
task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:16:18Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=42b59c9e-2075-4426-aca3-d27c2ca4a97e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.487 187643 DEBUG nova.network.os_vif_util [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converting VIF {"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.487 187643 DEBUG nova.network.os_vif_util [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.488 187643 DEBUG os_vif [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.488 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.489 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.489 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.491 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.491 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14d62fb0-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.491 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap14d62fb0-c8, col_values=(('external_ids', {'iface-id': '14d62fb0-c869-4b3c-8914-c1863a5aafe0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:3a:4e', 'vm-uuid': '42b59c9e-2075-4426-aca3-d27c2ca4a97e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.531 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:24 compute-0 NetworkManager[57207]: <info>  [1771845384.5318] manager: (tap14d62fb0-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.534 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.537 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.538 187643 INFO os_vif [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8')
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.577 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.577 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.578 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] No VIF found with MAC fa:16:3e:5f:3a:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:16:24 compute-0 nova_compute[187639]: 2026-02-23 11:16:24.578 187643 INFO nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Using config drive
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.510 187643 INFO nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Creating config drive at /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.config
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.513 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp_fp7e904 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.629 187643 DEBUG oslo_concurrency.processutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp_fp7e904" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:16:25 compute-0 kernel: tap14d62fb0-c8: entered promiscuous mode
Feb 23 11:16:25 compute-0 NetworkManager[57207]: <info>  [1771845385.6820] manager: (tap14d62fb0-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.681 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:25 compute-0 ovn_controller[97601]: 2026-02-23T11:16:25Z|00180|binding|INFO|Claiming lport 14d62fb0-c869-4b3c-8914-c1863a5aafe0 for this chassis.
Feb 23 11:16:25 compute-0 ovn_controller[97601]: 2026-02-23T11:16:25Z|00181|binding|INFO|14d62fb0-c869-4b3c-8914-c1863a5aafe0: Claiming fa:16:3e:5f:3a:4e 10.100.0.8
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:16:25 compute-0 ovn_controller[97601]: 2026-02-23T11:16:25Z|00182|binding|INFO|Setting lport 14d62fb0-c869-4b3c-8914-c1863a5aafe0 ovn-installed in OVS
Feb 23 11:16:25 compute-0 ovn_controller[97601]: 2026-02-23T11:16:25Z|00183|binding|INFO|Setting lport 14d62fb0-c869-4b3c-8914-c1863a5aafe0 up in Southbound
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.688 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:3a:4e 10.100.0.8'], port_security=['fa:16:3e:5f:3a:4e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '42b59c9e-2075-4426-aca3-d27c2ca4a97e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=14d62fb0-c869-4b3c-8914-c1863a5aafe0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.688 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.690 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 14d62fb0-c869-4b3c-8914-c1863a5aafe0 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef bound to our chassis
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.691 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.699 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e483988b-22bc-4b85-aa78-dcf36a142322]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.700 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b12da8d-31 in ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.702 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b12da8d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.702 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[90e0d66e-22b0-4370-ae26-630c6724d0b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.702 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7c945e-e727-4960-8af1-e196ec48be4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 systemd-machined[156970]: New machine qemu-17-instance-00000017.
Feb 23 11:16:25 compute-0 systemd-udevd[215714]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.710 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc7ea5e-99f5-4068-8262-35ddc8f083c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Feb 23 11:16:25 compute-0 NetworkManager[57207]: <info>  [1771845385.7178] device (tap14d62fb0-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:16:25 compute-0 NetworkManager[57207]: <info>  [1771845385.7187] device (tap14d62fb0-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.729 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e1896798-2b7a-4cee-9868-f0edd9b37c83]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.746 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[6194295f-6c2d-4279-b8d4-af1b14bcd269]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.750 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6bc41b21-c58e-46d4-aaf8-fa6eeb9d5f82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 NetworkManager[57207]: <info>  [1771845385.7510] manager: (tap4b12da8d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Feb 23 11:16:25 compute-0 systemd-udevd[215718]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.767 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[96121938-36fd-4f0e-8969-027ea5df4ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.769 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[c7759428-58b3-4813-9e5f-acc5d41eb5eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 NetworkManager[57207]: <info>  [1771845385.7854] device (tap4b12da8d-30): carrier: link connected
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.789 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[be49fe80-22ff-4ed8-a633-725dd9a34549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.800 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[013f7258-b729-49aa-bd63-848f711d8572]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452995, 'reachable_time': 41525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215746, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.812 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4a23b0-791a-4744-be16-dc627ae1a75e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8a8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452995, 'tstamp': 452995}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215747, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.821 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd37ca0-08fa-4b68-8a98-3871cdf82c49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b12da8d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8a:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452995, 'reachable_time': 41525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215748, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.848 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8f9927-db83-43cc-9cb1-61e87a5cad79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.885 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[09f193ff-3ece-4441-b69f-78cd938945d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.886 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.886 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.888 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b12da8d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:25 compute-0 NetworkManager[57207]: <info>  [1771845385.8901] manager: (tap4b12da8d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Feb 23 11:16:25 compute-0 kernel: tap4b12da8d-30: entered promiscuous mode
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.889 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.893 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.894 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b12da8d-30, col_values=(('external_ids', {'iface-id': '586378da-906d-4768-bab7-0954450c4a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.895 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:25 compute-0 ovn_controller[97601]: 2026-02-23T11:16:25Z|00184|binding|INFO|Releasing lport 586378da-906d-4768-bab7-0954450c4a57 from this chassis (sb_readonly=0)
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.901 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.902 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.904 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[df33462e-6cd4-434b-b3c8-c4e58645db5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.904 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/4b12da8d-3150-4d44-b948-8d49ddadedef.pid.haproxy
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 4b12da8d-3150-4d44-b948-8d49ddadedef
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:16:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:25.904 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'env', 'PROCESS_TAG=haproxy-4b12da8d-3150-4d44-b948-8d49ddadedef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b12da8d-3150-4d44-b948-8d49ddadedef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.907 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845385.9070895, 42b59c9e-2075-4426-aca3-d27c2ca4a97e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.907 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] VM Started (Lifecycle Event)
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.925 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.928 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845385.9071755, 42b59c9e-2075-4426-aca3-d27c2ca4a97e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.929 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] VM Paused (Lifecycle Event)
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.952 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.955 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:16:25 compute-0 nova_compute[187639]: 2026-02-23 11:16:25.978 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.196 187643 DEBUG nova.compute.manager [req-71973073-8d00-4197-9d1c-32195490c8a4 req-875e5f5c-26cd-426f-92af-1ff6fe118e63 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.196 187643 DEBUG oslo_concurrency.lockutils [req-71973073-8d00-4197-9d1c-32195490c8a4 req-875e5f5c-26cd-426f-92af-1ff6fe118e63 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.197 187643 DEBUG oslo_concurrency.lockutils [req-71973073-8d00-4197-9d1c-32195490c8a4 req-875e5f5c-26cd-426f-92af-1ff6fe118e63 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.197 187643 DEBUG oslo_concurrency.lockutils [req-71973073-8d00-4197-9d1c-32195490c8a4 req-875e5f5c-26cd-426f-92af-1ff6fe118e63 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:26 compute-0 podman[215787]: 2026-02-23 11:16:26.197389013 +0000 UTC m=+0.052035109 container create 143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.197 187643 DEBUG nova.compute.manager [req-71973073-8d00-4197-9d1c-32195490c8a4 req-875e5f5c-26cd-426f-92af-1ff6fe118e63 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Processing event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.198 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.201 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845386.2013416, 42b59c9e-2075-4426-aca3-d27c2ca4a97e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.202 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] VM Resumed (Lifecycle Event)
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.203 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.207 187643 INFO nova.virt.libvirt.driver [-] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Instance spawned successfully.
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.207 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.237 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.238 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.239 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.240 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.240 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.241 187643 DEBUG nova.virt.libvirt.driver [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:16:26 compute-0 systemd[1]: Started libpod-conmon-143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95.scope.
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.246 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.250 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:16:26 compute-0 podman[215787]: 2026-02-23 11:16:26.166405098 +0000 UTC m=+0.021051174 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:16:26 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:16:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c07521d2d4355900015b0828899b0f868d1158c2b82c6ce8b26982d5135d0d03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:16:26 compute-0 podman[215787]: 2026-02-23 11:16:26.280062896 +0000 UTC m=+0.134708962 container init 143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 11:16:26 compute-0 podman[215787]: 2026-02-23 11:16:26.28365448 +0000 UTC m=+0.138300536 container start 143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.294 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:16:26 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [NOTICE]   (215806) : New worker (215808) forked
Feb 23 11:16:26 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [NOTICE]   (215806) : Loading success.
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.324 187643 INFO nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Took 8.06 seconds to spawn the instance on the hypervisor.
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.325 187643 DEBUG nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.378 187643 INFO nova.compute.manager [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Took 8.48 seconds to build instance.
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.392 187643 DEBUG oslo_concurrency.lockutils [None req-b679db86-6c70-4b5d-8890-b0ba176e176d 48814d91aad6418f9d55fc9967ed0087 5dfbb0ac693b4065ada17052ebb303dd - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.558 187643 DEBUG nova.network.neutron [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updated VIF entry in instance network info cache for port 14d62fb0-c869-4b3c-8914-c1863a5aafe0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.559 187643 DEBUG nova.network.neutron [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updating instance_info_cache with network_info: [{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:16:26 compute-0 nova_compute[187639]: 2026-02-23 11:16:26.578 187643 DEBUG oslo_concurrency.lockutils [req-211569fd-d090-4326-958d-df57c512818d req-579994d1-1a81-4026-b421-deebc56cda0e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:16:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:16:27.905 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.288 187643 DEBUG nova.compute.manager [req-3251ddaa-8238-49c1-b5ee-6d6bb4b4c3ad req-5f028b48-8ccd-4895-8e77-b9301161095e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.289 187643 DEBUG oslo_concurrency.lockutils [req-3251ddaa-8238-49c1-b5ee-6d6bb4b4c3ad req-5f028b48-8ccd-4895-8e77-b9301161095e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.290 187643 DEBUG oslo_concurrency.lockutils [req-3251ddaa-8238-49c1-b5ee-6d6bb4b4c3ad req-5f028b48-8ccd-4895-8e77-b9301161095e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.290 187643 DEBUG oslo_concurrency.lockutils [req-3251ddaa-8238-49c1-b5ee-6d6bb4b4c3ad req-5f028b48-8ccd-4895-8e77-b9301161095e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.291 187643 DEBUG nova.compute.manager [req-3251ddaa-8238-49c1-b5ee-6d6bb4b4c3ad req-5f028b48-8ccd-4895-8e77-b9301161095e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.291 187643 WARNING nova.compute.manager [req-3251ddaa-8238-49c1-b5ee-6d6bb4b4c3ad req-5f028b48-8ccd-4895-8e77-b9301161095e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received unexpected event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with vm_state active and task_state None.
Feb 23 11:16:28 compute-0 nova_compute[187639]: 2026-02-23 11:16:28.967 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:29 compute-0 nova_compute[187639]: 2026-02-23 11:16:29.533 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:29 compute-0 podman[197002]: time="2026-02-23T11:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:16:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:16:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 23 11:16:29 compute-0 podman[215817]: 2026-02-23 11:16:29.855533154 +0000 UTC m=+0.060437320 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:16:31 compute-0 openstack_network_exporter[199919]: ERROR   11:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:16:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:16:31 compute-0 openstack_network_exporter[199919]: ERROR   11:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:16:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:16:33 compute-0 podman[215840]: 2026-02-23 11:16:33.834647881 +0000 UTC m=+0.042551529 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 11:16:33 compute-0 nova_compute[187639]: 2026-02-23 11:16:33.969 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:34 compute-0 nova_compute[187639]: 2026-02-23 11:16:34.537 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:37 compute-0 ovn_controller[97601]: 2026-02-23T11:16:37Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:3a:4e 10.100.0.8
Feb 23 11:16:37 compute-0 ovn_controller[97601]: 2026-02-23T11:16:37Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:3a:4e 10.100.0.8
Feb 23 11:16:38 compute-0 nova_compute[187639]: 2026-02-23 11:16:38.983 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:39 compute-0 nova_compute[187639]: 2026-02-23 11:16:39.539 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:42 compute-0 podman[215874]: 2026-02-23 11:16:42.928348543 +0000 UTC m=+0.130944733 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:16:44 compute-0 nova_compute[187639]: 2026-02-23 11:16:44.027 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:44 compute-0 nova_compute[187639]: 2026-02-23 11:16:44.541 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:46 compute-0 podman[215901]: 2026-02-23 11:16:46.849440624 +0000 UTC m=+0.048865555 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 11:16:47 compute-0 sshd-session[215922]: Invalid user user from 143.198.30.3 port 42080
Feb 23 11:16:47 compute-0 sshd-session[215922]: Connection closed by invalid user user 143.198.30.3 port 42080 [preauth]
Feb 23 11:16:49 compute-0 nova_compute[187639]: 2026-02-23 11:16:49.030 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:49 compute-0 nova_compute[187639]: 2026-02-23 11:16:49.543 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:54 compute-0 nova_compute[187639]: 2026-02-23 11:16:54.031 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:54 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 23 11:16:54 compute-0 nova_compute[187639]: 2026-02-23 11:16:54.544 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:59 compute-0 nova_compute[187639]: 2026-02-23 11:16:59.032 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:59 compute-0 nova_compute[187639]: 2026-02-23 11:16:59.547 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:16:59 compute-0 podman[197002]: time="2026-02-23T11:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:16:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:16:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 23 11:17:00 compute-0 podman[215926]: 2026-02-23 11:17:00.857061021 +0000 UTC m=+0.056455815 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:17:01 compute-0 openstack_network_exporter[199919]: ERROR   11:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:17:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:17:01 compute-0 openstack_network_exporter[199919]: ERROR   11:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:17:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:17:02 compute-0 sshd-session[215950]: Invalid user admin from 165.227.79.48 port 38866
Feb 23 11:17:02 compute-0 sshd-session[215950]: Connection closed by invalid user admin 165.227.79.48 port 38866 [preauth]
Feb 23 11:17:03 compute-0 ovn_controller[97601]: 2026-02-23T11:17:03Z|00185|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 23 11:17:04 compute-0 nova_compute[187639]: 2026-02-23 11:17:04.034 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:04 compute-0 nova_compute[187639]: 2026-02-23 11:17:04.549 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:04 compute-0 podman[215952]: 2026-02-23 11:17:04.834294489 +0000 UTC m=+0.040494915 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 11:17:05 compute-0 nova_compute[187639]: 2026-02-23 11:17:05.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:08 compute-0 nova_compute[187639]: 2026-02-23 11:17:08.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:09 compute-0 nova_compute[187639]: 2026-02-23 11:17:09.080 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:09 compute-0 nova_compute[187639]: 2026-02-23 11:17:09.550 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:09 compute-0 nova_compute[187639]: 2026-02-23 11:17:09.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:09 compute-0 nova_compute[187639]: 2026-02-23 11:17:09.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:17:10 compute-0 nova_compute[187639]: 2026-02-23 11:17:10.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:10 compute-0 nova_compute[187639]: 2026-02-23 11:17:10.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:17:10 compute-0 nova_compute[187639]: 2026-02-23 11:17:10.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:17:11 compute-0 nova_compute[187639]: 2026-02-23 11:17:11.247 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:17:11 compute-0 nova_compute[187639]: 2026-02-23 11:17:11.247 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:17:11 compute-0 nova_compute[187639]: 2026-02-23 11:17:11.247 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:17:11 compute-0 nova_compute[187639]: 2026-02-23 11:17:11.248 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 42b59c9e-2075-4426-aca3-d27c2ca4a97e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:17:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:12.663 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:12.664 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:12.665 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:13 compute-0 nova_compute[187639]: 2026-02-23 11:17:13.660 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updating instance_info_cache with network_info: [{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:17:13 compute-0 nova_compute[187639]: 2026-02-23 11:17:13.687 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:17:13 compute-0 nova_compute[187639]: 2026-02-23 11:17:13.688 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:17:13 compute-0 nova_compute[187639]: 2026-02-23 11:17:13.688 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:13 compute-0 nova_compute[187639]: 2026-02-23 11:17:13.689 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:13 compute-0 podman[215973]: 2026-02-23 11:17:13.93603474 +0000 UTC m=+0.136207601 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 11:17:14 compute-0 nova_compute[187639]: 2026-02-23 11:17:14.122 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:14 compute-0 nova_compute[187639]: 2026-02-23 11:17:14.552 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:14 compute-0 nova_compute[187639]: 2026-02-23 11:17:14.967 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Check if temp file /var/lib/nova/instances/tmpfmbwyego exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 11:17:14 compute-0 nova_compute[187639]: 2026-02-23 11:17:14.967 187643 DEBUG nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfmbwyego',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='42b59c9e-2075-4426-aca3-d27c2ca4a97e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 11:17:15 compute-0 nova_compute[187639]: 2026-02-23 11:17:15.840 187643 DEBUG oslo_concurrency.processutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:17:15 compute-0 nova_compute[187639]: 2026-02-23 11:17:15.906 187643 DEBUG oslo_concurrency.processutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:17:15 compute-0 nova_compute[187639]: 2026-02-23 11:17:15.908 187643 DEBUG oslo_concurrency.processutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:17:15 compute-0 nova_compute[187639]: 2026-02-23 11:17:15.959 187643 DEBUG oslo_concurrency.processutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:17:16 compute-0 nova_compute[187639]: 2026-02-23 11:17:16.683 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.713 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.714 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.714 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.714 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:17:17 compute-0 sshd-session[216005]: Accepted publickey for nova from 192.168.122.101 port 56494 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.788 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:17:17 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 11:17:17 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 11:17:17 compute-0 systemd-logind[808]: New session 41 of user nova.
Feb 23 11:17:17 compute-0 podman[216008]: 2026-02-23 11:17:17.827313418 +0000 UTC m=+0.068338298 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, managed_by=edpm_ansible, version=9.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, release=1770267347, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 11:17:17 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 11:17:17 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 11:17:17 compute-0 systemd[216031]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.855 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.856 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:17:17 compute-0 nova_compute[187639]: 2026-02-23 11:17:17.898 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:17:17 compute-0 systemd[216031]: Queued start job for default target Main User Target.
Feb 23 11:17:17 compute-0 systemd[216031]: Created slice User Application Slice.
Feb 23 11:17:17 compute-0 systemd[216031]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:17:17 compute-0 systemd[216031]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 11:17:17 compute-0 systemd[216031]: Reached target Paths.
Feb 23 11:17:17 compute-0 systemd[216031]: Reached target Timers.
Feb 23 11:17:17 compute-0 systemd[216031]: Starting D-Bus User Message Bus Socket...
Feb 23 11:17:17 compute-0 systemd[216031]: Starting Create User's Volatile Files and Directories...
Feb 23 11:17:18 compute-0 systemd[216031]: Listening on D-Bus User Message Bus Socket.
Feb 23 11:17:18 compute-0 systemd[216031]: Reached target Sockets.
Feb 23 11:17:18 compute-0 sshd-session[216049]: Invalid user user from 143.198.30.3 port 40566
Feb 23 11:17:18 compute-0 systemd[216031]: Finished Create User's Volatile Files and Directories.
Feb 23 11:17:18 compute-0 systemd[216031]: Reached target Basic System.
Feb 23 11:17:18 compute-0 systemd[216031]: Reached target Main User Target.
Feb 23 11:17:18 compute-0 systemd[216031]: Startup finished in 156ms.
Feb 23 11:17:18 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 11:17:18 compute-0 sshd-session[216049]: Connection closed by invalid user user 143.198.30.3 port 40566 [preauth]
Feb 23 11:17:18 compute-0 systemd[1]: Started Session 41 of User nova.
Feb 23 11:17:18 compute-0 sshd-session[216005]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.050 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.052 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5608MB free_disk=73.1757698059082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.052 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.052 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:18 compute-0 sshd-session[216054]: Received disconnect from 192.168.122.101 port 56494:11: disconnected by user
Feb 23 11:17:18 compute-0 sshd-session[216054]: Disconnected from user nova 192.168.122.101 port 56494
Feb 23 11:17:18 compute-0 sshd-session[216005]: pam_unix(sshd:session): session closed for user nova
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.110 187643 INFO nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updating resource usage from migration 00cda63f-ebaa-421c-95f4-3b176d58ec33
Feb 23 11:17:18 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Feb 23 11:17:18 compute-0 systemd-logind[808]: Session 41 logged out. Waiting for processes to exit.
Feb 23 11:17:18 compute-0 systemd-logind[808]: Removed session 41.
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.141 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Migration 00cda63f-ebaa-421c-95f4-3b176d58ec33 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.141 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.141 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.190 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.210 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.244 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.244 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.844 187643 DEBUG nova.compute.manager [req-64170950-03ed-44e0-990d-3c72de2bbe2c req-66ba9e20-20f6-4925-9526-e3aa25d2cb25 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.844 187643 DEBUG oslo_concurrency.lockutils [req-64170950-03ed-44e0-990d-3c72de2bbe2c req-66ba9e20-20f6-4925-9526-e3aa25d2cb25 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.845 187643 DEBUG oslo_concurrency.lockutils [req-64170950-03ed-44e0-990d-3c72de2bbe2c req-66ba9e20-20f6-4925-9526-e3aa25d2cb25 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.846 187643 DEBUG oslo_concurrency.lockutils [req-64170950-03ed-44e0-990d-3c72de2bbe2c req-66ba9e20-20f6-4925-9526-e3aa25d2cb25 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.846 187643 DEBUG nova.compute.manager [req-64170950-03ed-44e0-990d-3c72de2bbe2c req-66ba9e20-20f6-4925-9526-e3aa25d2cb25 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:18 compute-0 nova_compute[187639]: 2026-02-23 11:17:18.846 187643 DEBUG nova.compute.manager [req-64170950-03ed-44e0-990d-3c72de2bbe2c req-66ba9e20-20f6-4925-9526-e3aa25d2cb25 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.174 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.437 187643 INFO nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Took 3.48 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.438 187643 DEBUG nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:17:19 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:19.447 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:17:19 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:19.447 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.448 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:19 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:19.448 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.480 187643 DEBUG nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfmbwyego',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='42b59c9e-2075-4426-aca3-d27c2ca4a97e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(00cda63f-ebaa-421c-95f4-3b176d58ec33),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.506 187643 DEBUG nova.objects.instance [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid 42b59c9e-2075-4426-aca3-d27c2ca4a97e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.507 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.509 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.509 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.528 187643 DEBUG nova.virt.libvirt.vif [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:16:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-10307845',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-10307845',id=23,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:16:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-rnnmsydt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:16:26Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=42b59c9e-2075-4426-aca3-d27c2ca4a97e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.528 187643 DEBUG nova.network.os_vif_util [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.529 187643 DEBUG nova.network.os_vif_util [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.529 187643 DEBUG nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 11:17:19 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:5f:3a:4e"/>
Feb 23 11:17:19 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 11:17:19 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:17:19 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 11:17:19 compute-0 nova_compute[187639]:   <target dev="tap14d62fb0-c8"/>
Feb 23 11:17:19 compute-0 nova_compute[187639]: </interface>
Feb 23 11:17:19 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.530 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 11:17:19 compute-0 nova_compute[187639]: 2026-02-23 11:17:19.555 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.013 187643 DEBUG nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.013 187643 INFO nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.113 187643 INFO nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.617 187643 DEBUG nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.617 187643 DEBUG nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.955 187643 DEBUG nova.compute.manager [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.955 187643 DEBUG oslo_concurrency.lockutils [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.956 187643 DEBUG oslo_concurrency.lockutils [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.956 187643 DEBUG oslo_concurrency.lockutils [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.956 187643 DEBUG nova.compute.manager [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.956 187643 WARNING nova.compute.manager [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received unexpected event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with vm_state active and task_state migrating.
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.957 187643 DEBUG nova.compute.manager [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-changed-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.957 187643 DEBUG nova.compute.manager [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Refreshing instance network info cache due to event network-changed-14d62fb0-c869-4b3c-8914-c1863a5aafe0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.957 187643 DEBUG oslo_concurrency.lockutils [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.957 187643 DEBUG oslo_concurrency.lockutils [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:17:20 compute-0 nova_compute[187639]: 2026-02-23 11:17:20.957 187643 DEBUG nova.network.neutron [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Refreshing network info cache for port 14d62fb0-c869-4b3c-8914-c1863a5aafe0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.083 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845441.0835092, 42b59c9e-2075-4426-aca3-d27c2ca4a97e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.084 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] VM Paused (Lifecycle Event)
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.108 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.110 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.120 187643 DEBUG nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.120 187643 DEBUG nova.virt.libvirt.migration [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.130 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 11:17:21 compute-0 kernel: tap14d62fb0-c8 (unregistering): left promiscuous mode
Feb 23 11:17:21 compute-0 NetworkManager[57207]: <info>  [1771845441.1929] device (tap14d62fb0-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.196 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:21 compute-0 ovn_controller[97601]: 2026-02-23T11:17:21Z|00186|binding|INFO|Releasing lport 14d62fb0-c869-4b3c-8914-c1863a5aafe0 from this chassis (sb_readonly=0)
Feb 23 11:17:21 compute-0 ovn_controller[97601]: 2026-02-23T11:17:21Z|00187|binding|INFO|Setting lport 14d62fb0-c869-4b3c-8914-c1863a5aafe0 down in Southbound
Feb 23 11:17:21 compute-0 ovn_controller[97601]: 2026-02-23T11:17:21Z|00188|binding|INFO|Removing iface tap14d62fb0-c8 ovn-installed in OVS
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.206 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.204 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:3a:4e 10.100.0.8'], port_security=['fa:16:3e:5f:3a:4e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '42b59c9e-2075-4426-aca3-d27c2ca4a97e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b12da8d-3150-4d44-b948-8d49ddadedef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dfbb0ac693b4065ada17052ebb303dd', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3e488cdc-be4f-476f-950d-ceaccafbc7c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f01bc28e-e7e8-4713-b96b-3656fd233e44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=14d62fb0-c869-4b3c-8914-c1863a5aafe0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.207 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 14d62fb0-c869-4b3c-8914-c1863a5aafe0 in datapath 4b12da8d-3150-4d44-b948-8d49ddadedef unbound from our chassis
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.209 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b12da8d-3150-4d44-b948-8d49ddadedef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.211 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[5f091787-67ec-485d-bcfe-229961b0d34e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.212 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef namespace which is not needed anymore
Feb 23 11:17:21 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Feb 23 11:17:21 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 13.070s CPU time.
Feb 23 11:17:21 compute-0 systemd-machined[156970]: Machine qemu-17-instance-00000017 terminated.
Feb 23 11:17:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [NOTICE]   (215806) : haproxy version is 2.8.14-c23fe91
Feb 23 11:17:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [NOTICE]   (215806) : path to executable is /usr/sbin/haproxy
Feb 23 11:17:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [WARNING]  (215806) : Exiting Master process...
Feb 23 11:17:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [ALERT]    (215806) : Current worker (215808) exited with code 143 (Terminated)
Feb 23 11:17:21 compute-0 neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef[215802]: [WARNING]  (215806) : All workers exited. Exiting... (0)
Feb 23 11:17:21 compute-0 systemd[1]: libpod-143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95.scope: Deactivated successfully.
Feb 23 11:17:21 compute-0 podman[216096]: 2026-02-23 11:17:21.323870011 +0000 UTC m=+0.039572322 container died 143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 11:17:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95-userdata-shm.mount: Deactivated successfully.
Feb 23 11:17:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-c07521d2d4355900015b0828899b0f868d1158c2b82c6ce8b26982d5135d0d03-merged.mount: Deactivated successfully.
Feb 23 11:17:21 compute-0 podman[216096]: 2026-02-23 11:17:21.364911639 +0000 UTC m=+0.080613920 container cleanup 143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 11:17:21 compute-0 systemd[1]: libpod-conmon-143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95.scope: Deactivated successfully.
Feb 23 11:17:21 compute-0 podman[216127]: 2026-02-23 11:17:21.423484519 +0000 UTC m=+0.043050143 container remove 143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.425 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.425 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.425 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.428 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[54062e7b-4d56-44de-9bc4-a46a96d8a7dd]: (4, ('Mon Feb 23 11:17:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95)\n143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95\nMon Feb 23 11:17:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef (143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95)\n143b9b3cb09d2a6b56363801598b9221c663cafb5f19fbb05d6ba0e3472e7d95\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.430 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4eb9a9-1817-41c6-874d-39d4d2e8dd97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.431 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b12da8d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.434 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:21 compute-0 kernel: tap4b12da8d-30: left promiscuous mode
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.440 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.445 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e051cc3c-a9bf-4f19-ad23-90726fe51823]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.458 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3b56a95f-9c27-4fca-8737-8a7738a06d54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.459 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[18caaa37-2f2e-419a-bcda-62f9d920ccf2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.469 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[73799ac1-bb40-4e13-a232-0eba194ecc88]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452991, 'reachable_time': 36456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216162, 'error': None, 'target': 'ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b12da8d\x2d3150\x2d4d44\x2db948\x2d8d49ddadedef.mount: Deactivated successfully.
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.472 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b12da8d-3150-4d44-b948-8d49ddadedef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:17:21 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:17:21.472 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3bab75-a6f9-451c-b35e-94522da41f6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.622 187643 DEBUG nova.virt.libvirt.guest [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '42b59c9e-2075-4426-aca3-d27c2ca4a97e' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.623 187643 INFO nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migration operation has completed
Feb 23 11:17:21 compute-0 nova_compute[187639]: 2026-02-23 11:17:21.623 187643 INFO nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] _post_live_migration() is started..
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.407 187643 DEBUG nova.network.neutron [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updated VIF entry in instance network info cache for port 14d62fb0-c869-4b3c-8914-c1863a5aafe0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.407 187643 DEBUG nova.network.neutron [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Updating instance_info_cache with network_info: [{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.442 187643 DEBUG oslo_concurrency.lockutils [req-6e2dd748-ee49-464d-8ff4-d922b0279301 req-61e9410c-6a1d-4e5c-8816-2c21a167862b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-42b59c9e-2075-4426-aca3-d27c2ca4a97e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.470 187643 DEBUG nova.compute.manager [req-9ff1e5c0-68f1-4dcf-a6a3-92505099c34c req-12d67887-511d-4fb3-8570-2fdd2ebcb60f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.471 187643 DEBUG oslo_concurrency.lockutils [req-9ff1e5c0-68f1-4dcf-a6a3-92505099c34c req-12d67887-511d-4fb3-8570-2fdd2ebcb60f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.471 187643 DEBUG oslo_concurrency.lockutils [req-9ff1e5c0-68f1-4dcf-a6a3-92505099c34c req-12d67887-511d-4fb3-8570-2fdd2ebcb60f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.472 187643 DEBUG oslo_concurrency.lockutils [req-9ff1e5c0-68f1-4dcf-a6a3-92505099c34c req-12d67887-511d-4fb3-8570-2fdd2ebcb60f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.472 187643 DEBUG nova.compute.manager [req-9ff1e5c0-68f1-4dcf-a6a3-92505099c34c req-12d67887-511d-4fb3-8570-2fdd2ebcb60f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.473 187643 DEBUG nova.compute.manager [req-9ff1e5c0-68f1-4dcf-a6a3-92505099c34c req-12d67887-511d-4fb3-8570-2fdd2ebcb60f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.799 187643 DEBUG nova.network.neutron [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port 14d62fb0-c869-4b3c-8914-c1863a5aafe0 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.800 187643 DEBUG nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.800 187643 DEBUG nova.virt.libvirt.vif [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:16:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-10307845',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-10307845',id=23,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:16:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5dfbb0ac693b4065ada17052ebb303dd',ramdisk_id='',reservation_id='r-rnnmsydt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-126537390',owner_user_name='tempest-TestExecuteStrategies-126537390-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:17:11Z,user_data=None,user_id='48814d91aad6418f9d55fc9967ed0087',uuid=42b59c9e-2075-4426-aca3-d27c2ca4a97e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.801 187643 DEBUG nova.network.os_vif_util [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "address": "fa:16:3e:5f:3a:4e", "network": {"id": "4b12da8d-3150-4d44-b948-8d49ddadedef", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-2023250868-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5dfbb0ac693b4065ada17052ebb303dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14d62fb0-c8", "ovs_interfaceid": "14d62fb0-c869-4b3c-8914-c1863a5aafe0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.802 187643 DEBUG nova.network.os_vif_util [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.802 187643 DEBUG os_vif [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.804 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.804 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d62fb0-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.806 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.808 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.810 187643 INFO os_vif [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:3a:4e,bridge_name='br-int',has_traffic_filtering=True,id=14d62fb0-c869-4b3c-8914-c1863a5aafe0,network=Network(4b12da8d-3150-4d44-b948-8d49ddadedef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14d62fb0-c8')
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.810 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.811 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.811 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.811 187643 DEBUG nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.812 187643 INFO nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Deleting instance files /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e_del
Feb 23 11:17:22 compute-0 nova_compute[187639]: 2026-02-23 11:17:22.812 187643 INFO nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Deletion of /var/lib/nova/instances/42b59c9e-2075-4426-aca3-d27c2ca4a97e_del complete
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.053 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.054 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.054 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.054 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.055 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.055 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-unplugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.055 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.056 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.056 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.056 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.056 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.057 187643 WARNING nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received unexpected event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with vm_state active and task_state migrating.
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.057 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.057 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.057 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.058 187643 DEBUG oslo_concurrency.lockutils [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.058 187643 DEBUG nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:23 compute-0 nova_compute[187639]: 2026-02-23 11:17:23.058 187643 WARNING nova.compute.manager [req-524e91f0-753e-4e10-a117-ea0f84f4a04d req-fc4befc5-e59b-4585-b264-1422ace849a5 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received unexpected event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with vm_state active and task_state migrating.
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.175 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.596 187643 DEBUG nova.compute.manager [req-e4181d62-b7ff-4c58-b1ca-b72f1cd994d4 req-d5a4d869-79b5-4c0e-9ee0-fd90ac24d708 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.596 187643 DEBUG oslo_concurrency.lockutils [req-e4181d62-b7ff-4c58-b1ca-b72f1cd994d4 req-d5a4d869-79b5-4c0e-9ee0-fd90ac24d708 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.596 187643 DEBUG oslo_concurrency.lockutils [req-e4181d62-b7ff-4c58-b1ca-b72f1cd994d4 req-d5a4d869-79b5-4c0e-9ee0-fd90ac24d708 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.596 187643 DEBUG oslo_concurrency.lockutils [req-e4181d62-b7ff-4c58-b1ca-b72f1cd994d4 req-d5a4d869-79b5-4c0e-9ee0-fd90ac24d708 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.597 187643 DEBUG nova.compute.manager [req-e4181d62-b7ff-4c58-b1ca-b72f1cd994d4 req-d5a4d869-79b5-4c0e-9ee0-fd90ac24d708 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:24 compute-0 nova_compute[187639]: 2026-02-23 11:17:24.597 187643 WARNING nova.compute.manager [req-e4181d62-b7ff-4c58-b1ca-b72f1cd994d4 req-d5a4d869-79b5-4c0e-9ee0-fd90ac24d708 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received unexpected event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with vm_state active and task_state migrating.
Feb 23 11:17:25 compute-0 nova_compute[187639]: 2026-02-23 11:17:25.130 187643 DEBUG nova.compute.manager [req-94dc3e24-29a5-485b-856b-0002079ad2a2 req-dd1b3eac-235a-4231-84f6-26e63a42580c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:17:25 compute-0 nova_compute[187639]: 2026-02-23 11:17:25.131 187643 DEBUG oslo_concurrency.lockutils [req-94dc3e24-29a5-485b-856b-0002079ad2a2 req-dd1b3eac-235a-4231-84f6-26e63a42580c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:25 compute-0 nova_compute[187639]: 2026-02-23 11:17:25.131 187643 DEBUG oslo_concurrency.lockutils [req-94dc3e24-29a5-485b-856b-0002079ad2a2 req-dd1b3eac-235a-4231-84f6-26e63a42580c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:25 compute-0 nova_compute[187639]: 2026-02-23 11:17:25.132 187643 DEBUG oslo_concurrency.lockutils [req-94dc3e24-29a5-485b-856b-0002079ad2a2 req-dd1b3eac-235a-4231-84f6-26e63a42580c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:25 compute-0 nova_compute[187639]: 2026-02-23 11:17:25.132 187643 DEBUG nova.compute.manager [req-94dc3e24-29a5-485b-856b-0002079ad2a2 req-dd1b3eac-235a-4231-84f6-26e63a42580c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] No waiting events found dispatching network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:17:25 compute-0 nova_compute[187639]: 2026-02-23 11:17:25.132 187643 WARNING nova.compute.manager [req-94dc3e24-29a5-485b-856b-0002079ad2a2 req-dd1b3eac-235a-4231-84f6-26e63a42580c 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Received unexpected event network-vif-plugged-14d62fb0-c869-4b3c-8914-c1863a5aafe0 for instance with vm_state active and task_state migrating.
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.458 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.460 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.460 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "42b59c9e-2075-4426-aca3-d27c2ca4a97e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.483 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.483 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.484 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.484 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.680 187643 WARNING nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.681 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5776MB free_disk=73.20465469360352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.681 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.681 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.727 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance 42b59c9e-2075-4426-aca3-d27c2ca4a97e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.748 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.793 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration 00cda63f-ebaa-421c-95f4-3b176d58ec33 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.794 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.794 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.808 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.833 187643 DEBUG nova.compute.provider_tree [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.848 187643 DEBUG nova.scheduler.client.report [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.876 187643 DEBUG nova.compute.resource_tracker [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.877 187643 DEBUG oslo_concurrency.lockutils [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:17:27 compute-0 nova_compute[187639]: 2026-02-23 11:17:27.886 187643 INFO nova.compute.manager [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 11:17:28 compute-0 nova_compute[187639]: 2026-02-23 11:17:28.022 187643 INFO nova.scheduler.client.report [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration 00cda63f-ebaa-421c-95f4-3b176d58ec33
Feb 23 11:17:28 compute-0 nova_compute[187639]: 2026-02-23 11:17:28.023 187643 DEBUG nova.virt.libvirt.driver [None req-559d9f2d-c6cb-4d9b-961a-1598e33208d9 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 11:17:28 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 11:17:28 compute-0 systemd[216031]: Activating special unit Exit the Session...
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped target Main User Target.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped target Basic System.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped target Paths.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped target Sockets.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped target Timers.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 11:17:28 compute-0 systemd[216031]: Closed D-Bus User Message Bus Socket.
Feb 23 11:17:28 compute-0 systemd[216031]: Stopped Create User's Volatile Files and Directories.
Feb 23 11:17:28 compute-0 systemd[216031]: Removed slice User Application Slice.
Feb 23 11:17:28 compute-0 systemd[216031]: Reached target Shutdown.
Feb 23 11:17:28 compute-0 systemd[216031]: Finished Exit the Session.
Feb 23 11:17:28 compute-0 systemd[216031]: Reached target Exit the Session.
Feb 23 11:17:28 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 11:17:28 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 11:17:28 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 11:17:28 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 11:17:28 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 11:17:28 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 11:17:28 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 11:17:29 compute-0 nova_compute[187639]: 2026-02-23 11:17:29.177 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:29 compute-0 podman[197002]: time="2026-02-23T11:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:17:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:17:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 23 11:17:31 compute-0 openstack_network_exporter[199919]: ERROR   11:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:17:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:17:31 compute-0 openstack_network_exporter[199919]: ERROR   11:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:17:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:17:31 compute-0 podman[216167]: 2026-02-23 11:17:31.507861469 +0000 UTC m=+0.057892753 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:17:32 compute-0 nova_compute[187639]: 2026-02-23 11:17:32.810 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:34 compute-0 nova_compute[187639]: 2026-02-23 11:17:34.178 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:35 compute-0 podman[216191]: 2026-02-23 11:17:35.865519655 +0000 UTC m=+0.055654094 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 11:17:36 compute-0 nova_compute[187639]: 2026-02-23 11:17:36.424 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845441.4231186, 42b59c9e-2075-4426-aca3-d27c2ca4a97e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:17:36 compute-0 nova_compute[187639]: 2026-02-23 11:17:36.424 187643 INFO nova.compute.manager [-] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] VM Stopped (Lifecycle Event)
Feb 23 11:17:36 compute-0 nova_compute[187639]: 2026-02-23 11:17:36.460 187643 DEBUG nova.compute.manager [None req-794c1eb1-0305-46ff-8326-265b28aa3e46 - - - - - -] [instance: 42b59c9e-2075-4426-aca3-d27c2ca4a97e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:17:37 compute-0 nova_compute[187639]: 2026-02-23 11:17:37.812 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:39 compute-0 nova_compute[187639]: 2026-02-23 11:17:39.221 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:42 compute-0 nova_compute[187639]: 2026-02-23 11:17:42.814 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:44 compute-0 nova_compute[187639]: 2026-02-23 11:17:44.261 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:44 compute-0 podman[216210]: 2026-02-23 11:17:44.895662994 +0000 UTC m=+0.094109954 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 11:17:47 compute-0 sshd-session[216238]: Invalid user admin from 165.227.79.48 port 36724
Feb 23 11:17:47 compute-0 sshd-session[216238]: Connection closed by invalid user admin 165.227.79.48 port 36724 [preauth]
Feb 23 11:17:47 compute-0 nova_compute[187639]: 2026-02-23 11:17:47.817 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:48 compute-0 sshd-session[216241]: error: kex_exchange_identification: read: Connection reset by peer
Feb 23 11:17:48 compute-0 sshd-session[216241]: Connection reset by 176.120.22.52 port 15173
Feb 23 11:17:48 compute-0 podman[216242]: 2026-02-23 11:17:48.851583322 +0000 UTC m=+0.058953611 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 11:17:48 compute-0 sshd-session[216264]: Invalid user user from 143.198.30.3 port 36846
Feb 23 11:17:48 compute-0 sshd-session[216264]: Connection closed by invalid user user 143.198.30.3 port 36846 [preauth]
Feb 23 11:17:49 compute-0 nova_compute[187639]: 2026-02-23 11:17:49.263 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:50 compute-0 nova_compute[187639]: 2026-02-23 11:17:50.750 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:17:52 compute-0 nova_compute[187639]: 2026-02-23 11:17:52.820 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:54 compute-0 nova_compute[187639]: 2026-02-23 11:17:54.319 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:57 compute-0 nova_compute[187639]: 2026-02-23 11:17:57.822 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:59 compute-0 nova_compute[187639]: 2026-02-23 11:17:59.320 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:17:59 compute-0 podman[197002]: time="2026-02-23T11:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:17:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:17:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 23 11:18:00 compute-0 nova_compute[187639]: 2026-02-23 11:18:00.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:00 compute-0 nova_compute[187639]: 2026-02-23 11:18:00.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 11:18:00 compute-0 nova_compute[187639]: 2026-02-23 11:18:00.708 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 11:18:01 compute-0 openstack_network_exporter[199919]: ERROR   11:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:18:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:18:01 compute-0 openstack_network_exporter[199919]: ERROR   11:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:18:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:18:01 compute-0 podman[216267]: 2026-02-23 11:18:01.854576313 +0000 UTC m=+0.059236868 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:18:02 compute-0 nova_compute[187639]: 2026-02-23 11:18:02.824 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:03 compute-0 ovn_controller[97601]: 2026-02-23T11:18:03Z|00189|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 23 11:18:04 compute-0 nova_compute[187639]: 2026-02-23 11:18:04.357 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:06 compute-0 nova_compute[187639]: 2026-02-23 11:18:06.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:06 compute-0 podman[216292]: 2026-02-23 11:18:06.885957728 +0000 UTC m=+0.080597460 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 23 11:18:07 compute-0 nova_compute[187639]: 2026-02-23 11:18:07.826 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:09 compute-0 nova_compute[187639]: 2026-02-23 11:18:09.359 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:10 compute-0 nova_compute[187639]: 2026-02-23 11:18:10.004 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:10 compute-0 nova_compute[187639]: 2026-02-23 11:18:10.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:10 compute-0 nova_compute[187639]: 2026-02-23 11:18:10.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:11 compute-0 nova_compute[187639]: 2026-02-23 11:18:11.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:11 compute-0 nova_compute[187639]: 2026-02-23 11:18:11.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:18:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:18:12.664 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:18:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:18:12.665 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:18:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:18:12.665 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:18:12 compute-0 nova_compute[187639]: 2026-02-23 11:18:12.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:12 compute-0 nova_compute[187639]: 2026-02-23 11:18:12.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:18:12 compute-0 nova_compute[187639]: 2026-02-23 11:18:12.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:18:12 compute-0 nova_compute[187639]: 2026-02-23 11:18:12.710 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:18:12 compute-0 nova_compute[187639]: 2026-02-23 11:18:12.829 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:13 compute-0 nova_compute[187639]: 2026-02-23 11:18:13.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:14 compute-0 nova_compute[187639]: 2026-02-23 11:18:14.398 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:15 compute-0 nova_compute[187639]: 2026-02-23 11:18:15.687 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:15 compute-0 podman[216312]: 2026-02-23 11:18:15.915530772 +0000 UTC m=+0.105543175 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.43.0)
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.733 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.831 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.932 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.935 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5825MB free_disk=73.20543670654297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.936 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:18:17 compute-0 nova_compute[187639]: 2026-02-23 11:18:17.936 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:18:18 compute-0 nova_compute[187639]: 2026-02-23 11:18:18.304 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:18:18 compute-0 nova_compute[187639]: 2026-02-23 11:18:18.305 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:18:18 compute-0 nova_compute[187639]: 2026-02-23 11:18:18.337 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:18:18 compute-0 nova_compute[187639]: 2026-02-23 11:18:18.361 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:18:18 compute-0 nova_compute[187639]: 2026-02-23 11:18:18.362 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:18:18 compute-0 nova_compute[187639]: 2026-02-23 11:18:18.363 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:18:19 compute-0 nova_compute[187639]: 2026-02-23 11:18:19.437 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:19 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:18:19.488 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:18:19 compute-0 nova_compute[187639]: 2026-02-23 11:18:19.489 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:19 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:18:19.490 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:18:19 compute-0 podman[216340]: 2026-02-23 11:18:19.870749179 +0000 UTC m=+0.070567015 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 11:18:21 compute-0 sshd-session[216361]: Invalid user user from 143.198.30.3 port 49928
Feb 23 11:18:21 compute-0 sshd-session[216361]: Connection closed by invalid user user 143.198.30.3 port 49928 [preauth]
Feb 23 11:18:22 compute-0 nova_compute[187639]: 2026-02-23 11:18:22.834 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:24 compute-0 nova_compute[187639]: 2026-02-23 11:18:24.479 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:27 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:18:27.492 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:18:27 compute-0 nova_compute[187639]: 2026-02-23 11:18:27.837 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:28 compute-0 nova_compute[187639]: 2026-02-23 11:18:28.359 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:29 compute-0 nova_compute[187639]: 2026-02-23 11:18:29.526 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:29 compute-0 podman[197002]: time="2026-02-23T11:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:18:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:18:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 23 11:18:30 compute-0 nova_compute[187639]: 2026-02-23 11:18:30.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:30 compute-0 nova_compute[187639]: 2026-02-23 11:18:30.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 11:18:31 compute-0 openstack_network_exporter[199919]: ERROR   11:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:18:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:18:31 compute-0 openstack_network_exporter[199919]: ERROR   11:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:18:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:18:32 compute-0 sshd-session[216363]: Invalid user admin from 165.227.79.48 port 48312
Feb 23 11:18:32 compute-0 sshd-session[216363]: Connection closed by invalid user admin 165.227.79.48 port 48312 [preauth]
Feb 23 11:18:32 compute-0 podman[216365]: 2026-02-23 11:18:32.236978445 +0000 UTC m=+0.129176417 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:18:32 compute-0 nova_compute[187639]: 2026-02-23 11:18:32.839 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:34 compute-0 nova_compute[187639]: 2026-02-23 11:18:34.570 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:35 compute-0 nova_compute[187639]: 2026-02-23 11:18:35.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:35 compute-0 nova_compute[187639]: 2026-02-23 11:18:35.746 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:18:37 compute-0 podman[216389]: 2026-02-23 11:18:37.8348775 +0000 UTC m=+0.042987951 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Feb 23 11:18:37 compute-0 nova_compute[187639]: 2026-02-23 11:18:37.842 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:39 compute-0 nova_compute[187639]: 2026-02-23 11:18:39.573 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:42 compute-0 nova_compute[187639]: 2026-02-23 11:18:42.844 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:43 compute-0 ovn_controller[97601]: 2026-02-23T11:18:43Z|00190|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 23 11:18:44 compute-0 nova_compute[187639]: 2026-02-23 11:18:44.574 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:46 compute-0 podman[216408]: 2026-02-23 11:18:46.929175155 +0000 UTC m=+0.130293926 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 11:18:47 compute-0 nova_compute[187639]: 2026-02-23 11:18:47.846 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:49 compute-0 nova_compute[187639]: 2026-02-23 11:18:49.616 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:50 compute-0 podman[216434]: 2026-02-23 11:18:50.834577685 +0000 UTC m=+0.042312224 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter)
Feb 23 11:18:52 compute-0 nova_compute[187639]: 2026-02-23 11:18:52.848 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:54 compute-0 sshd-session[216456]: Invalid user user from 143.198.30.3 port 58726
Feb 23 11:18:54 compute-0 sshd-session[216456]: Connection closed by invalid user user 143.198.30.3 port 58726 [preauth]
Feb 23 11:18:54 compute-0 nova_compute[187639]: 2026-02-23 11:18:54.618 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:57 compute-0 nova_compute[187639]: 2026-02-23 11:18:57.850 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:59 compute-0 nova_compute[187639]: 2026-02-23 11:18:59.619 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:18:59 compute-0 podman[197002]: time="2026-02-23T11:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:18:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:18:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 23 11:19:01 compute-0 openstack_network_exporter[199919]: ERROR   11:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:19:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:19:01 compute-0 openstack_network_exporter[199919]: ERROR   11:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:19:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.529 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.529 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.543 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.619 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.619 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.625 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.626 187643 INFO nova.compute.claims [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.815 187643 DEBUG nova.compute.provider_tree [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.835 187643 DEBUG nova.scheduler.client.report [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.866 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.867 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.939 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.940 187643 DEBUG nova.network.neutron [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.966 187643 INFO nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:19:01 compute-0 nova_compute[187639]: 2026-02-23 11:19:01.989 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.113 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.115 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.115 187643 INFO nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Creating image(s)
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.116 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "/var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.117 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "/var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.118 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "/var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.143 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.186 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.187 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.188 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.202 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.245 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.246 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.272 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.273 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.273 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.333 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.334 187643 DEBUG nova.virt.disk.api [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Checking if we can resize image /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.334 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.387 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.389 187643 DEBUG nova.virt.disk.api [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Cannot resize image /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.389 187643 DEBUG nova.objects.instance [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lazy-loading 'migration_context' on Instance uuid d6fe1241-58d7-4610-81f3-a3f8564f59d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.519 187643 DEBUG nova.policy [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2f9eed11e5b346abaed95c13131885fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd4e819326be4b09ab11d74172cd7171', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.852 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:02 compute-0 podman[216474]: 2026-02-23 11:19:02.859366403 +0000 UTC m=+0.062304689 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.987 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.988 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Ensure instance console log exists: /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.988 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.988 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:02 compute-0 nova_compute[187639]: 2026-02-23 11:19:02.989 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:03 compute-0 nova_compute[187639]: 2026-02-23 11:19:03.611 187643 DEBUG nova.network.neutron [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Successfully created port: a82f383e-fe79-4df7-ba9e-233aabc0ab23 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.620 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.699 187643 DEBUG nova.network.neutron [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Successfully updated port: a82f383e-fe79-4df7-ba9e-233aabc0ab23 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.738 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.738 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquired lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.739 187643 DEBUG nova.network.neutron [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.801 187643 DEBUG nova.compute.manager [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-changed-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.802 187643 DEBUG nova.compute.manager [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Refreshing instance network info cache due to event network-changed-a82f383e-fe79-4df7-ba9e-233aabc0ab23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.802 187643 DEBUG oslo_concurrency.lockutils [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:19:04 compute-0 nova_compute[187639]: 2026-02-23 11:19:04.874 187643 DEBUG nova.network.neutron [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.639 187643 DEBUG nova.network.neutron [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updating instance_info_cache with network_info: [{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.660 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Releasing lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.660 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Instance network_info: |[{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.661 187643 DEBUG oslo_concurrency.lockutils [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.661 187643 DEBUG nova.network.neutron [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Refreshing network info cache for port a82f383e-fe79-4df7-ba9e-233aabc0ab23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.666 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Start _get_guest_xml network_info=[{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.671 187643 WARNING nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.677 187643 DEBUG nova.virt.libvirt.host [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.678 187643 DEBUG nova.virt.libvirt.host [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.686 187643 DEBUG nova.virt.libvirt.host [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.687 187643 DEBUG nova.virt.libvirt.host [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.689 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.690 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.690 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.691 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.691 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.691 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.691 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.691 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.692 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.692 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.692 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.692 187643 DEBUG nova.virt.hardware [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.696 187643 DEBUG nova.virt.libvirt.vif [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:19:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2083996683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2083996683',id=26,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd4e819326be4b09ab11d74172cd7171',ramdisk_id='',reservation_id='r-0i2f2ox4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-683269409',owner_user_name='tempest-TestExecuteVmWorkloadBa
lanceStrategy-683269409-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:19:02Z,user_data=None,user_id='2f9eed11e5b346abaed95c13131885fd',uuid=d6fe1241-58d7-4610-81f3-a3f8564f59d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.697 187643 DEBUG nova.network.os_vif_util [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Converting VIF {"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.697 187643 DEBUG nova.network.os_vif_util [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.698 187643 DEBUG nova.objects.instance [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lazy-loading 'pci_devices' on Instance uuid d6fe1241-58d7-4610-81f3-a3f8564f59d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.713 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <uuid>d6fe1241-58d7-4610-81f3-a3f8564f59d4</uuid>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <name>instance-0000001a</name>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-2083996683</nova:name>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:19:05</nova:creationTime>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:user uuid="2f9eed11e5b346abaed95c13131885fd">tempest-TestExecuteVmWorkloadBalanceStrategy-683269409-project-member</nova:user>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:project uuid="dd4e819326be4b09ab11d74172cd7171">tempest-TestExecuteVmWorkloadBalanceStrategy-683269409</nova:project>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         <nova:port uuid="a82f383e-fe79-4df7-ba9e-233aabc0ab23">
Feb 23 11:19:05 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <system>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <entry name="serial">d6fe1241-58d7-4610-81f3-a3f8564f59d4</entry>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <entry name="uuid">d6fe1241-58d7-4610-81f3-a3f8564f59d4</entry>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </system>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <os>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </os>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <features>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </features>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.config"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:64:5b:8a"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <target dev="tapa82f383e-fe"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/console.log" append="off"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <video>
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </video>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:19:05 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:19:05 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:19:05 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:19:05 compute-0 nova_compute[187639]: </domain>
Feb 23 11:19:05 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.714 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Preparing to wait for external event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.714 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.714 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.714 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.715 187643 DEBUG nova.virt.libvirt.vif [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:19:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2083996683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2083996683',id=26,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd4e819326be4b09ab11d74172cd7171',ramdisk_id='',reservation_id='r-0i2f2ox4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-683269409',owner_user_name='tempest-TestExecuteVm
WorkloadBalanceStrategy-683269409-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:19:02Z,user_data=None,user_id='2f9eed11e5b346abaed95c13131885fd',uuid=d6fe1241-58d7-4610-81f3-a3f8564f59d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.715 187643 DEBUG nova.network.os_vif_util [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Converting VIF {"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.716 187643 DEBUG nova.network.os_vif_util [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.716 187643 DEBUG os_vif [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.717 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.717 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.717 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.720 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.720 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa82f383e-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.720 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa82f383e-fe, col_values=(('external_ids', {'iface-id': 'a82f383e-fe79-4df7-ba9e-233aabc0ab23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:5b:8a', 'vm-uuid': 'd6fe1241-58d7-4610-81f3-a3f8564f59d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.722 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:05 compute-0 NetworkManager[57207]: <info>  [1771845545.7230] manager: (tapa82f383e-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.725 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.727 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.728 187643 INFO os_vif [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe')
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.773 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.774 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.774 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] No VIF found with MAC fa:16:3e:64:5b:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:19:05 compute-0 nova_compute[187639]: 2026-02-23 11:19:05.775 187643 INFO nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Using config drive
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.242 187643 INFO nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Creating config drive at /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.config
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.249 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp6zzy102k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.373 187643 DEBUG oslo_concurrency.processutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp6zzy102k" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:06 compute-0 kernel: tapa82f383e-fe: entered promiscuous mode
Feb 23 11:19:06 compute-0 NetworkManager[57207]: <info>  [1771845546.4345] manager: (tapa82f383e-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Feb 23 11:19:06 compute-0 ovn_controller[97601]: 2026-02-23T11:19:06Z|00191|binding|INFO|Claiming lport a82f383e-fe79-4df7-ba9e-233aabc0ab23 for this chassis.
Feb 23 11:19:06 compute-0 ovn_controller[97601]: 2026-02-23T11:19:06Z|00192|binding|INFO|a82f383e-fe79-4df7-ba9e-233aabc0ab23: Claiming fa:16:3e:64:5b:8a 10.100.0.3
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.436 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.444 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.461 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:5b:8a 10.100.0.3'], port_security=['fa:16:3e:64:5b:8a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd6fe1241-58d7-4610-81f3-a3f8564f59d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd4e819326be4b09ab11d74172cd7171', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07be435b-6a75-4c74-9fb3-d89755cbb1b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc54eace-e37f-4ce1-923a-73b41965c6cd, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=a82f383e-fe79-4df7-ba9e-233aabc0ab23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.463 106968 INFO neutron.agent.ovn.metadata.agent [-] Port a82f383e-fe79-4df7-ba9e-233aabc0ab23 in datapath c6d43b50-54ec-4f28-9f0b-90af6069e6cc bound to our chassis
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.466 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c6d43b50-54ec-4f28-9f0b-90af6069e6cc
Feb 23 11:19:06 compute-0 systemd-machined[156970]: New machine qemu-18-instance-0000001a.
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.480 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d18cc9-8622-4a4f-83b1-fe5dfa12f49d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.481 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc6d43b50-51 in ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:19:06 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000001a.
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.484 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc6d43b50-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.484 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d4379bbe-5fbf-4c8a-b1dc-8deb24fab4ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_controller[97601]: 2026-02-23T11:19:06Z|00193|binding|INFO|Setting lport a82f383e-fe79-4df7-ba9e-233aabc0ab23 ovn-installed in OVS
Feb 23 11:19:06 compute-0 ovn_controller[97601]: 2026-02-23T11:19:06Z|00194|binding|INFO|Setting lport a82f383e-fe79-4df7-ba9e-233aabc0ab23 up in Southbound
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.486 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1793ee74-8787-4f02-97c5-83b56a6fa2d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.487 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.497 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e987be-ee23-4eed-9e69-a9e669698d63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 systemd-udevd[216521]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:19:06 compute-0 NetworkManager[57207]: <info>  [1771845546.5127] device (tapa82f383e-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:19:06 compute-0 NetworkManager[57207]: <info>  [1771845546.5140] device (tapa82f383e-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.515 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[aca5be74-3b97-4305-86c9-27ce54c62616]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.545 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[5f483e07-1e04-4586-8084-a2a4e4baf7ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.551 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[69181167-40f9-4e36-a513-75499f0db817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 NetworkManager[57207]: <info>  [1771845546.5526] manager: (tapc6d43b50-50): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.575 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[59ea0b4f-5cd4-456e-8713-9c4a3b958a5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.580 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[925bb613-6b0c-44bc-bb3f-d402d7793abb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 NetworkManager[57207]: <info>  [1771845546.6018] device (tapc6d43b50-50): carrier: link connected
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.607 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcfba06-fc38-4935-be3a-16b4d300b534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.627 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[4a3af18a-2e25-42cb-acaa-39bee867c818]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc6d43b50-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:c1:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469076, 'reachable_time': 44461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216552, 'error': None, 'target': 'ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.640 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[dafd5464-f4cf-4208-99d1-3bb6f755cd8b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:c194'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469076, 'tstamp': 469076}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216553, 'error': None, 'target': 'ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.664 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8d6add2e-7ef2-4956-92a6-62ea6c33fec2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc6d43b50-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:c1:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469076, 'reachable_time': 44461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216554, 'error': None, 'target': 'ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.687 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d30c1b27-684b-4acb-a2ab-61661651295c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.730 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a770fc93-7e78-4a6b-82cf-aa91cf92436e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.732 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6d43b50-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.732 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.733 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6d43b50-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:19:06 compute-0 kernel: tapc6d43b50-50: entered promiscuous mode
Feb 23 11:19:06 compute-0 NetworkManager[57207]: <info>  [1771845546.7734] manager: (tapc6d43b50-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.776 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc6d43b50-50, col_values=(('external_ids', {'iface-id': '993ece47-221a-46dd-90d0-f6ed9b979b34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.777 187643 DEBUG nova.compute.manager [req-a1a6a129-3199-4aee-bd8a-5de7243d5264 req-d2cf08db-7538-4e52-b131-5f8d784e0abf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:19:06 compute-0 ovn_controller[97601]: 2026-02-23T11:19:06Z|00195|binding|INFO|Releasing lport 993ece47-221a-46dd-90d0-f6ed9b979b34 from this chassis (sb_readonly=0)
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.777 187643 DEBUG oslo_concurrency.lockutils [req-a1a6a129-3199-4aee-bd8a-5de7243d5264 req-d2cf08db-7538-4e52-b131-5f8d784e0abf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.778 187643 DEBUG oslo_concurrency.lockutils [req-a1a6a129-3199-4aee-bd8a-5de7243d5264 req-d2cf08db-7538-4e52-b131-5f8d784e0abf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.778 187643 DEBUG oslo_concurrency.lockutils [req-a1a6a129-3199-4aee-bd8a-5de7243d5264 req-d2cf08db-7538-4e52-b131-5f8d784e0abf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.779 187643 DEBUG nova.compute.manager [req-a1a6a129-3199-4aee-bd8a-5de7243d5264 req-d2cf08db-7538-4e52-b131-5f8d784e0abf 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Processing event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.779 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.781 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c6d43b50-54ec-4f28-9f0b-90af6069e6cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c6d43b50-54ec-4f28-9f0b-90af6069e6cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.784 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2db48e-ff4b-48f3-9049-adc07dfaa398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.785 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-c6d43b50-54ec-4f28-9f0b-90af6069e6cc
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/c6d43b50-54ec-4f28-9f0b-90af6069e6cc.pid.haproxy
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID c6d43b50-54ec-4f28-9f0b-90af6069e6cc
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:19:06 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:06.786 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'env', 'PROCESS_TAG=haproxy-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c6d43b50-54ec-4f28-9f0b-90af6069e6cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.787 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.976 187643 DEBUG nova.network.neutron [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updated VIF entry in instance network info cache for port a82f383e-fe79-4df7-ba9e-233aabc0ab23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.977 187643 DEBUG nova.network.neutron [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updating instance_info_cache with network_info: [{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:19:06 compute-0 nova_compute[187639]: 2026-02-23 11:19:06.995 187643 DEBUG oslo_concurrency.lockutils [req-27d1e4e0-cb7e-4ae6-92cf-b7df2dbe85c0 req-00703cbe-57dd-415c-9a79-c9ac439aa8f2 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:19:07 compute-0 podman[216587]: 2026-02-23 11:19:07.114274429 +0000 UTC m=+0.046962025 container create b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 11:19:07 compute-0 systemd[1]: Started libpod-conmon-b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba.scope.
Feb 23 11:19:07 compute-0 podman[216587]: 2026-02-23 11:19:07.088975475 +0000 UTC m=+0.021663161 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:19:07 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:19:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9d5eda75c27737e880e213fbb71c5ccffd268b06d00e087e95f25b09f55a22b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:19:07 compute-0 podman[216587]: 2026-02-23 11:19:07.212956443 +0000 UTC m=+0.145644049 container init b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:19:07 compute-0 podman[216587]: 2026-02-23 11:19:07.217276327 +0000 UTC m=+0.149963913 container start b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 11:19:07 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [NOTICE]   (216606) : New worker (216608) forked
Feb 23 11:19:07 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [NOTICE]   (216606) : Loading success.
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.234 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845548.233456, d6fe1241-58d7-4610-81f3-a3f8564f59d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.236 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] VM Started (Lifecycle Event)
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.239 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.244 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.250 187643 INFO nova.virt.libvirt.driver [-] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Instance spawned successfully.
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.251 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.256 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.260 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.269 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.269 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.269 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.270 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.270 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.270 187643 DEBUG nova.virt.libvirt.driver [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.291 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.292 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845548.2336493, d6fe1241-58d7-4610-81f3-a3f8564f59d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.292 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] VM Paused (Lifecycle Event)
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.313 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.317 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845548.2431917, d6fe1241-58d7-4610-81f3-a3f8564f59d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.317 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] VM Resumed (Lifecycle Event)
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.330 187643 INFO nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Took 6.22 seconds to spawn the instance on the hypervisor.
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.331 187643 DEBUG nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.374 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.378 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.426 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.441 187643 INFO nova.compute.manager [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Took 6.85 seconds to build instance.
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.462 187643 DEBUG oslo_concurrency.lockutils [None req-b780f395-ca23-4cc8-9f6d-b83c5544bfad 2f9eed11e5b346abaed95c13131885fd dd4e819326be4b09ab11d74172cd7171 - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.713 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.890 187643 DEBUG nova.compute.manager [req-910e295a-1023-4fdd-bc09-182ca289b764 req-879c5679-600f-456a-9f78-8fdb5ca76973 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.891 187643 DEBUG oslo_concurrency.lockutils [req-910e295a-1023-4fdd-bc09-182ca289b764 req-879c5679-600f-456a-9f78-8fdb5ca76973 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.891 187643 DEBUG oslo_concurrency.lockutils [req-910e295a-1023-4fdd-bc09-182ca289b764 req-879c5679-600f-456a-9f78-8fdb5ca76973 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.892 187643 DEBUG oslo_concurrency.lockutils [req-910e295a-1023-4fdd-bc09-182ca289b764 req-879c5679-600f-456a-9f78-8fdb5ca76973 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.892 187643 DEBUG nova.compute.manager [req-910e295a-1023-4fdd-bc09-182ca289b764 req-879c5679-600f-456a-9f78-8fdb5ca76973 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:19:08 compute-0 nova_compute[187639]: 2026-02-23 11:19:08.892 187643 WARNING nova.compute.manager [req-910e295a-1023-4fdd-bc09-182ca289b764 req-879c5679-600f-456a-9f78-8fdb5ca76973 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received unexpected event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with vm_state active and task_state None.
Feb 23 11:19:08 compute-0 podman[216624]: 2026-02-23 11:19:08.896423102 +0000 UTC m=+0.091518286 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:19:09 compute-0 nova_compute[187639]: 2026-02-23 11:19:09.626 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:10 compute-0 nova_compute[187639]: 2026-02-23 11:19:10.762 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:11 compute-0 nova_compute[187639]: 2026-02-23 11:19:11.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:11 compute-0 nova_compute[187639]: 2026-02-23 11:19:11.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:19:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:12.664 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:12.666 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:19:12.667 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.866 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.867 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.867 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:19:12 compute-0 nova_compute[187639]: 2026-02-23 11:19:12.868 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid d6fe1241-58d7-4610-81f3-a3f8564f59d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.627 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.754 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updating instance_info_cache with network_info: [{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.772 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.773 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.773 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.774 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:14 compute-0 nova_compute[187639]: 2026-02-23 11:19:14.774 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:15 compute-0 nova_compute[187639]: 2026-02-23 11:19:15.805 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:17 compute-0 nova_compute[187639]: 2026-02-23 11:19:17.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:17 compute-0 nova_compute[187639]: 2026-02-23 11:19:17.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:17 compute-0 podman[216644]: 2026-02-23 11:19:17.888255363 +0000 UTC m=+0.084858272 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:19:19 compute-0 nova_compute[187639]: 2026-02-23 11:19:19.626 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:19 compute-0 nova_compute[187639]: 2026-02-23 11:19:19.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:19:20 compute-0 ovn_controller[97601]: 2026-02-23T11:19:20Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:64:5b:8a 10.100.0.3
Feb 23 11:19:20 compute-0 ovn_controller[97601]: 2026-02-23T11:19:20Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:5b:8a 10.100.0.3
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.171 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.172 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.172 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.173 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.242 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.313 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.315 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.357 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.508 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.509 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5641MB free_disk=73.17572021484375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.509 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.509 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.624 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance d6fe1241-58d7-4610-81f3-a3f8564f59d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.624 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.624 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.666 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.679 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.699 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.699 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:19:20 compute-0 nova_compute[187639]: 2026-02-23 11:19:20.807 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:21 compute-0 sshd-session[216695]: Invalid user admin from 165.227.79.48 port 35140
Feb 23 11:19:21 compute-0 sshd-session[216695]: Connection closed by invalid user admin 165.227.79.48 port 35140 [preauth]
Feb 23 11:19:21 compute-0 podman[216697]: 2026-02-23 11:19:21.468514496 +0000 UTC m=+0.069103717 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64)
Feb 23 11:19:24 compute-0 nova_compute[187639]: 2026-02-23 11:19:24.629 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:25 compute-0 nova_compute[187639]: 2026-02-23 11:19:25.809 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:26 compute-0 sshd-session[216718]: Invalid user user from 143.198.30.3 port 43990
Feb 23 11:19:26 compute-0 sshd-session[216718]: Connection closed by invalid user user 143.198.30.3 port 43990 [preauth]
Feb 23 11:19:29 compute-0 nova_compute[187639]: 2026-02-23 11:19:29.630 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:29 compute-0 podman[197002]: time="2026-02-23T11:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:19:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:19:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Feb 23 11:19:30 compute-0 nova_compute[187639]: 2026-02-23 11:19:30.812 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:31 compute-0 openstack_network_exporter[199919]: ERROR   11:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:19:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:19:31 compute-0 openstack_network_exporter[199919]: ERROR   11:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:19:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:19:33 compute-0 podman[216720]: 2026-02-23 11:19:33.886440618 +0000 UTC m=+0.087686756 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:19:34 compute-0 nova_compute[187639]: 2026-02-23 11:19:34.634 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:35 compute-0 nova_compute[187639]: 2026-02-23 11:19:35.869 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:36 compute-0 ovn_controller[97601]: 2026-02-23T11:19:36Z|00196|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 23 11:19:39 compute-0 nova_compute[187639]: 2026-02-23 11:19:39.637 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:39 compute-0 podman[216746]: 2026-02-23 11:19:39.839471677 +0000 UTC m=+0.045930799 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 23 11:19:40 compute-0 nova_compute[187639]: 2026-02-23 11:19:40.871 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:44 compute-0 nova_compute[187639]: 2026-02-23 11:19:44.638 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:45 compute-0 nova_compute[187639]: 2026-02-23 11:19:45.873 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:48 compute-0 podman[216768]: 2026-02-23 11:19:48.86476685 +0000 UTC m=+0.066529460 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 11:19:49 compute-0 nova_compute[187639]: 2026-02-23 11:19:49.640 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:50 compute-0 nova_compute[187639]: 2026-02-23 11:19:50.875 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:51 compute-0 podman[216794]: 2026-02-23 11:19:51.973782917 +0000 UTC m=+0.166000354 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1770267347, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Feb 23 11:19:54 compute-0 nova_compute[187639]: 2026-02-23 11:19:54.641 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:55 compute-0 nova_compute[187639]: 2026-02-23 11:19:55.877 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:57 compute-0 sshd-session[216817]: Invalid user user from 143.198.30.3 port 47472
Feb 23 11:19:57 compute-0 sshd-session[216817]: Connection closed by invalid user user 143.198.30.3 port 47472 [preauth]
Feb 23 11:19:59 compute-0 nova_compute[187639]: 2026-02-23 11:19:59.643 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:19:59 compute-0 podman[197002]: time="2026-02-23T11:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:19:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:19:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 23 11:20:00 compute-0 nova_compute[187639]: 2026-02-23 11:20:00.880 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:01 compute-0 openstack_network_exporter[199919]: ERROR   11:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:20:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:20:01 compute-0 openstack_network_exporter[199919]: ERROR   11:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:20:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:20:04 compute-0 sshd-session[216819]: Invalid user admin from 165.227.79.48 port 50826
Feb 23 11:20:04 compute-0 sshd-session[216819]: Connection closed by invalid user admin 165.227.79.48 port 50826 [preauth]
Feb 23 11:20:04 compute-0 podman[216821]: 2026-02-23 11:20:04.296410937 +0000 UTC m=+0.045553763 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:20:04 compute-0 nova_compute[187639]: 2026-02-23 11:20:04.644 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:05 compute-0 nova_compute[187639]: 2026-02-23 11:20:05.882 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:09 compute-0 nova_compute[187639]: 2026-02-23 11:20:09.645 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:10 compute-0 nova_compute[187639]: 2026-02-23 11:20:10.700 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:10 compute-0 podman[216845]: 2026-02-23 11:20:10.847450287 +0000 UTC m=+0.053218615 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216)
Feb 23 11:20:10 compute-0 nova_compute[187639]: 2026-02-23 11:20:10.884 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:12.665 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:12.666 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:12.666 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:12 compute-0 nova_compute[187639]: 2026-02-23 11:20:12.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:13 compute-0 nova_compute[187639]: 2026-02-23 11:20:13.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:13 compute-0 nova_compute[187639]: 2026-02-23 11:20:13.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:20:14 compute-0 nova_compute[187639]: 2026-02-23 11:20:14.646 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:14 compute-0 nova_compute[187639]: 2026-02-23 11:20:14.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:14 compute-0 nova_compute[187639]: 2026-02-23 11:20:14.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:20:14 compute-0 nova_compute[187639]: 2026-02-23 11:20:14.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:20:15 compute-0 nova_compute[187639]: 2026-02-23 11:20:15.464 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:20:15 compute-0 nova_compute[187639]: 2026-02-23 11:20:15.464 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:20:15 compute-0 nova_compute[187639]: 2026-02-23 11:20:15.465 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:20:15 compute-0 nova_compute[187639]: 2026-02-23 11:20:15.465 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid d6fe1241-58d7-4610-81f3-a3f8564f59d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:20:15 compute-0 nova_compute[187639]: 2026-02-23 11:20:15.886 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:16 compute-0 nova_compute[187639]: 2026-02-23 11:20:16.835 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updating instance_info_cache with network_info: [{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:20:16 compute-0 nova_compute[187639]: 2026-02-23 11:20:16.859 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:20:16 compute-0 nova_compute[187639]: 2026-02-23 11:20:16.859 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:20:16 compute-0 nova_compute[187639]: 2026-02-23 11:20:16.860 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:16 compute-0 nova_compute[187639]: 2026-02-23 11:20:16.861 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:19 compute-0 nova_compute[187639]: 2026-02-23 11:20:19.650 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:19 compute-0 nova_compute[187639]: 2026-02-23 11:20:19.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:19 compute-0 nova_compute[187639]: 2026-02-23 11:20:19.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:19 compute-0 podman[216865]: 2026-02-23 11:20:19.888755919 +0000 UTC m=+0.087417288 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:20:20 compute-0 nova_compute[187639]: 2026-02-23 11:20:20.889 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.725 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.781 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.844 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.844 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:20:21 compute-0 nova_compute[187639]: 2026-02-23 11:20:21.897 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.053 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.054 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5671MB free_disk=73.1757583618164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.055 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.055 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.131 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance d6fe1241-58d7-4610-81f3-a3f8564f59d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.132 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.132 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.180 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.199 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.200 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.201 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.281 187643 DEBUG nova.compute.manager [None req-3d369935-458b-42c7-bb3d-2353724384d6 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 23 11:20:22 compute-0 nova_compute[187639]: 2026-02-23 11:20:22.335 187643 DEBUG nova.compute.provider_tree [None req-3d369935-458b-42c7-bb3d-2353724384d6 a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 33 to 38 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:20:22 compute-0 podman[216911]: 2026-02-23 11:20:22.868333903 +0000 UTC m=+0.057499048 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Feb 23 11:20:24 compute-0 nova_compute[187639]: 2026-02-23 11:20:24.651 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:25 compute-0 nova_compute[187639]: 2026-02-23 11:20:25.891 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:26 compute-0 nova_compute[187639]: 2026-02-23 11:20:26.842 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Check if temp file /var/lib/nova/instances/tmpc8ont4hg exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 11:20:26 compute-0 nova_compute[187639]: 2026-02-23 11:20:26.843 187643 DEBUG nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc8ont4hg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d6fe1241-58d7-4610-81f3-a3f8564f59d4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 11:20:27 compute-0 nova_compute[187639]: 2026-02-23 11:20:27.604 187643 DEBUG oslo_concurrency.processutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:20:27 compute-0 nova_compute[187639]: 2026-02-23 11:20:27.649 187643 DEBUG oslo_concurrency.processutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:20:27 compute-0 nova_compute[187639]: 2026-02-23 11:20:27.650 187643 DEBUG oslo_concurrency.processutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:20:27 compute-0 nova_compute[187639]: 2026-02-23 11:20:27.695 187643 DEBUG oslo_concurrency.processutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:20:28 compute-0 sshd-session[216938]: Invalid user user from 143.198.30.3 port 60126
Feb 23 11:20:28 compute-0 sshd-session[216938]: Connection closed by invalid user user 143.198.30.3 port 60126 [preauth]
Feb 23 11:20:29 compute-0 sshd-session[216940]: Accepted publickey for nova from 192.168.122.101 port 38012 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 11:20:29 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 11:20:29 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 11:20:29 compute-0 systemd-logind[808]: New session 43 of user nova.
Feb 23 11:20:29 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 11:20:29 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 11:20:29 compute-0 systemd[216944]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:20:29 compute-0 systemd[216944]: Queued start job for default target Main User Target.
Feb 23 11:20:29 compute-0 systemd[216944]: Created slice User Application Slice.
Feb 23 11:20:29 compute-0 systemd[216944]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:20:29 compute-0 systemd[216944]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 11:20:29 compute-0 systemd[216944]: Reached target Paths.
Feb 23 11:20:29 compute-0 systemd[216944]: Reached target Timers.
Feb 23 11:20:29 compute-0 systemd[216944]: Starting D-Bus User Message Bus Socket...
Feb 23 11:20:29 compute-0 systemd[216944]: Starting Create User's Volatile Files and Directories...
Feb 23 11:20:29 compute-0 systemd[216944]: Listening on D-Bus User Message Bus Socket.
Feb 23 11:20:29 compute-0 systemd[216944]: Reached target Sockets.
Feb 23 11:20:29 compute-0 nova_compute[187639]: 2026-02-23 11:20:29.653 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:29 compute-0 systemd[216944]: Finished Create User's Volatile Files and Directories.
Feb 23 11:20:29 compute-0 systemd[216944]: Reached target Basic System.
Feb 23 11:20:29 compute-0 systemd[216944]: Reached target Main User Target.
Feb 23 11:20:29 compute-0 systemd[216944]: Startup finished in 125ms.
Feb 23 11:20:29 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 11:20:29 compute-0 systemd[1]: Started Session 43 of User nova.
Feb 23 11:20:29 compute-0 sshd-session[216940]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:20:29 compute-0 sshd-session[216960]: Received disconnect from 192.168.122.101 port 38012:11: disconnected by user
Feb 23 11:20:29 compute-0 sshd-session[216960]: Disconnected from user nova 192.168.122.101 port 38012
Feb 23 11:20:29 compute-0 sshd-session[216940]: pam_unix(sshd:session): session closed for user nova
Feb 23 11:20:29 compute-0 podman[197002]: time="2026-02-23T11:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:20:29 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Feb 23 11:20:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:20:29 compute-0 systemd-logind[808]: Session 43 logged out. Waiting for processes to exit.
Feb 23 11:20:29 compute-0 systemd-logind[808]: Removed session 43.
Feb 23 11:20:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2645 "" "Go-http-client/1.1"
Feb 23 11:20:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:30.861 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.862 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:30.863 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.878 187643 DEBUG nova.compute.manager [req-b2977cc5-c154-45c1-8f94-65a157dbe9e7 req-2a789899-e4fa-4237-ace4-c4e5007433de 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.880 187643 DEBUG oslo_concurrency.lockutils [req-b2977cc5-c154-45c1-8f94-65a157dbe9e7 req-2a789899-e4fa-4237-ace4-c4e5007433de 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.880 187643 DEBUG oslo_concurrency.lockutils [req-b2977cc5-c154-45c1-8f94-65a157dbe9e7 req-2a789899-e4fa-4237-ace4-c4e5007433de 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.881 187643 DEBUG oslo_concurrency.lockutils [req-b2977cc5-c154-45c1-8f94-65a157dbe9e7 req-2a789899-e4fa-4237-ace4-c4e5007433de 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.881 187643 DEBUG nova.compute.manager [req-b2977cc5-c154-45c1-8f94-65a157dbe9e7 req-2a789899-e4fa-4237-ace4-c4e5007433de 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.882 187643 DEBUG nova.compute.manager [req-b2977cc5-c154-45c1-8f94-65a157dbe9e7 req-2a789899-e4fa-4237-ace4-c4e5007433de 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:20:30 compute-0 nova_compute[187639]: 2026-02-23 11:20:30.892 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.198 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.393 187643 INFO nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Took 3.70 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.394 187643 DEBUG nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.410 187643 DEBUG nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc8ont4hg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d6fe1241-58d7-4610-81f3-a3f8564f59d4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d15a2bbd-f3a4-46f4-a5c9-12732188441a),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 11:20:31 compute-0 openstack_network_exporter[199919]: ERROR   11:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:20:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:20:31 compute-0 openstack_network_exporter[199919]: ERROR   11:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:20:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.431 187643 DEBUG nova.objects.instance [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid d6fe1241-58d7-4610-81f3-a3f8564f59d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.433 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.435 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.435 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.450 187643 DEBUG nova.virt.libvirt.vif [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:19:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2083996683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2083996683',id=26,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:19:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dd4e819326be4b09ab11d74172cd7171',ramdisk_id='',reservation_id='r-0i2f2ox4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-683269409',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-683269409-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:19:08Z,user_data=None,user_id='2f9eed11e5b346abaed95c13131885fd',uuid=d6fe1241-58d7-4610-81f3-a3f8564f59d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.450 187643 DEBUG nova.network.os_vif_util [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.451 187643 DEBUG nova.network.os_vif_util [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.451 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 11:20:31 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:64:5b:8a"/>
Feb 23 11:20:31 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 11:20:31 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:20:31 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 11:20:31 compute-0 nova_compute[187639]:   <target dev="tapa82f383e-fe"/>
Feb 23 11:20:31 compute-0 nova_compute[187639]: </interface>
Feb 23 11:20:31 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.452 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.937 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:20:31 compute-0 nova_compute[187639]: 2026-02-23 11:20:31.937 187643 INFO nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.033 187643 INFO nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.536 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.536 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.975 187643 DEBUG nova.compute.manager [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.976 187643 DEBUG oslo_concurrency.lockutils [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.976 187643 DEBUG oslo_concurrency.lockutils [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.976 187643 DEBUG oslo_concurrency.lockutils [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.977 187643 DEBUG nova.compute.manager [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.977 187643 WARNING nova.compute.manager [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received unexpected event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with vm_state active and task_state migrating.
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.977 187643 DEBUG nova.compute.manager [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-changed-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.978 187643 DEBUG nova.compute.manager [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Refreshing instance network info cache due to event network-changed-a82f383e-fe79-4df7-ba9e-233aabc0ab23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.978 187643 DEBUG oslo_concurrency.lockutils [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.978 187643 DEBUG oslo_concurrency.lockutils [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:20:32 compute-0 nova_compute[187639]: 2026-02-23 11:20:32.978 187643 DEBUG nova.network.neutron [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Refreshing network info cache for port a82f383e-fe79-4df7-ba9e-233aabc0ab23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.039 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.039 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.543 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.544 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.844 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845633.844304, d6fe1241-58d7-4610-81f3-a3f8564f59d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.845 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] VM Paused (Lifecycle Event)
Feb 23 11:20:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:33.866 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.868 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.872 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:20:33 compute-0 nova_compute[187639]: 2026-02-23 11:20:33.888 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.046 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.047 187643 DEBUG nova.virt.libvirt.migration [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:20:34 compute-0 kernel: tapa82f383e-fe (unregistering): left promiscuous mode
Feb 23 11:20:34 compute-0 NetworkManager[57207]: <info>  [1771845634.1832] device (tapa82f383e-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:20:34 compute-0 ovn_controller[97601]: 2026-02-23T11:20:34Z|00197|binding|INFO|Releasing lport a82f383e-fe79-4df7-ba9e-233aabc0ab23 from this chassis (sb_readonly=0)
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.187 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 ovn_controller[97601]: 2026-02-23T11:20:34Z|00198|binding|INFO|Setting lport a82f383e-fe79-4df7-ba9e-233aabc0ab23 down in Southbound
Feb 23 11:20:34 compute-0 ovn_controller[97601]: 2026-02-23T11:20:34Z|00199|binding|INFO|Removing iface tapa82f383e-fe ovn-installed in OVS
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.190 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.194 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.197 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:5b:8a 10.100.0.3'], port_security=['fa:16:3e:64:5b:8a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd6fe1241-58d7-4610-81f3-a3f8564f59d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd4e819326be4b09ab11d74172cd7171', 'neutron:revision_number': '8', 'neutron:security_group_ids': '07be435b-6a75-4c74-9fb3-d89755cbb1b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc54eace-e37f-4ce1-923a-73b41965c6cd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=a82f383e-fe79-4df7-ba9e-233aabc0ab23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.198 106968 INFO neutron.agent.ovn.metadata.agent [-] Port a82f383e-fe79-4df7-ba9e-233aabc0ab23 in datapath c6d43b50-54ec-4f28-9f0b-90af6069e6cc unbound from our chassis
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.199 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c6d43b50-54ec-4f28-9f0b-90af6069e6cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.201 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f0b57b-df65-45a8-958d-954bb0c3ebfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.202 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc namespace which is not needed anymore
Feb 23 11:20:34 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Feb 23 11:20:34 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Consumed 16.023s CPU time.
Feb 23 11:20:34 compute-0 systemd-machined[156970]: Machine qemu-18-instance-0000001a terminated.
Feb 23 11:20:34 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [NOTICE]   (216606) : haproxy version is 2.8.14-c23fe91
Feb 23 11:20:34 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [NOTICE]   (216606) : path to executable is /usr/sbin/haproxy
Feb 23 11:20:34 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [WARNING]  (216606) : Exiting Master process...
Feb 23 11:20:34 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [WARNING]  (216606) : Exiting Master process...
Feb 23 11:20:34 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [ALERT]    (216606) : Current worker (216608) exited with code 143 (Terminated)
Feb 23 11:20:34 compute-0 neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc[216602]: [WARNING]  (216606) : All workers exited. Exiting... (0)
Feb 23 11:20:34 compute-0 systemd[1]: libpod-b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba.scope: Deactivated successfully.
Feb 23 11:20:34 compute-0 podman[216995]: 2026-02-23 11:20:34.306388001 +0000 UTC m=+0.037968293 container died b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 11:20:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba-userdata-shm.mount: Deactivated successfully.
Feb 23 11:20:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-a9d5eda75c27737e880e213fbb71c5ccffd268b06d00e087e95f25b09f55a22b-merged.mount: Deactivated successfully.
Feb 23 11:20:34 compute-0 podman[216995]: 2026-02-23 11:20:34.360975182 +0000 UTC m=+0.092555474 container cleanup b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 11:20:34 compute-0 systemd[1]: libpod-conmon-b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba.scope: Deactivated successfully.
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.387 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.392 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.415 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.416 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.416 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 11:20:34 compute-0 podman[217034]: 2026-02-23 11:20:34.416784765 +0000 UTC m=+0.043269343 container remove b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.419 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[671c1a91-c65a-4b35-bd84-b43dfb59f1ed]: (4, ('Mon Feb 23 11:20:34 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc (b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba)\nb5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba\nMon Feb 23 11:20:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc (b5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba)\nb5156ebf112a309238b6d21617581072f97ab9efd507d0f682be7d5e547115ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.421 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d47790fb-a2ce-43c7-ab90-2604078fcd5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.421 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6d43b50-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:20:34 compute-0 podman[217023]: 2026-02-23 11:20:34.460164609 +0000 UTC m=+0.105919896 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.459 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 kernel: tapc6d43b50-50: left promiscuous mode
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.468 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.470 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4c3fc4-557e-4573-a06a-2958db452199]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.486 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ef799f-b453-4f27-beaf-e9ec2f6816a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.487 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[06528b39-3f91-41ae-b9e0-d72158a2b52a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.497 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d68a7243-efcb-4016-aec2-b27ccdb3df86]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469070, 'reachable_time': 35243, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217082, 'error': None, 'target': 'ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.498 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c6d43b50-54ec-4f28-9f0b-90af6069e6cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:20:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:20:34.499 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[d617b189-db56-43b3-8bc5-6121f56b084e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:20:34 compute-0 systemd[1]: run-netns-ovnmeta\x2dc6d43b50\x2d54ec\x2d4f28\x2d9f0b\x2d90af6069e6cc.mount: Deactivated successfully.
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.549 187643 DEBUG nova.virt.libvirt.guest [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'd6fe1241-58d7-4610-81f3-a3f8564f59d4' (instance-0000001a) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.550 187643 INFO nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migration operation has completed
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.550 187643 INFO nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] _post_live_migration() is started..
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.654 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.718 187643 DEBUG nova.compute.manager [req-c0d4b665-d297-4f1b-89de-8eacb6e60c7d req-f8a6d7c6-cbe6-45fe-8698-79ca20f09875 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.719 187643 DEBUG oslo_concurrency.lockutils [req-c0d4b665-d297-4f1b-89de-8eacb6e60c7d req-f8a6d7c6-cbe6-45fe-8698-79ca20f09875 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.719 187643 DEBUG oslo_concurrency.lockutils [req-c0d4b665-d297-4f1b-89de-8eacb6e60c7d req-f8a6d7c6-cbe6-45fe-8698-79ca20f09875 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.719 187643 DEBUG oslo_concurrency.lockutils [req-c0d4b665-d297-4f1b-89de-8eacb6e60c7d req-f8a6d7c6-cbe6-45fe-8698-79ca20f09875 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.719 187643 DEBUG nova.compute.manager [req-c0d4b665-d297-4f1b-89de-8eacb6e60c7d req-f8a6d7c6-cbe6-45fe-8698-79ca20f09875 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.719 187643 DEBUG nova.compute.manager [req-c0d4b665-d297-4f1b-89de-8eacb6e60c7d req-f8a6d7c6-cbe6-45fe-8698-79ca20f09875 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.721 187643 DEBUG nova.network.neutron [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updated VIF entry in instance network info cache for port a82f383e-fe79-4df7-ba9e-233aabc0ab23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.721 187643 DEBUG nova.network.neutron [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Updating instance_info_cache with network_info: [{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:20:34 compute-0 nova_compute[187639]: 2026-02-23 11:20:34.740 187643 DEBUG oslo_concurrency.lockutils [req-8c8a8058-22fe-452d-a388-980e32db2b9b req-376daa5a-4ed3-430b-af70-45dc2fa92d74 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-d6fe1241-58d7-4610-81f3-a3f8564f59d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.622 187643 DEBUG nova.network.neutron [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port a82f383e-fe79-4df7-ba9e-233aabc0ab23 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.622 187643 DEBUG nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.623 187643 DEBUG nova.virt.libvirt.vif [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:19:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2083996683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2083996683',id=26,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:19:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dd4e819326be4b09ab11d74172cd7171',ramdisk_id='',reservation_id='r-0i2f2ox4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-683269409',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-683269409-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:20:24Z,user_data=None,user_id='2f9eed11e5b346abaed95c13131885fd',uuid=d6fe1241-58d7-4610-81f3-a3f8564f59d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.623 187643 DEBUG nova.network.os_vif_util [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "address": "fa:16:3e:64:5b:8a", "network": {"id": "c6d43b50-54ec-4f28-9f0b-90af6069e6cc", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1402943732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd4e819326be4b09ab11d74172cd7171", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa82f383e-fe", "ovs_interfaceid": "a82f383e-fe79-4df7-ba9e-233aabc0ab23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.624 187643 DEBUG nova.network.os_vif_util [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.624 187643 DEBUG os_vif [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.626 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.626 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa82f383e-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.628 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.629 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.632 187643 INFO os_vif [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:5b:8a,bridge_name='br-int',has_traffic_filtering=True,id=a82f383e-fe79-4df7-ba9e-233aabc0ab23,network=Network(c6d43b50-54ec-4f28-9f0b-90af6069e6cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa82f383e-fe')
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.632 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.632 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.633 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.633 187643 DEBUG nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.633 187643 INFO nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Deleting instance files /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4_del
Feb 23 11:20:35 compute-0 nova_compute[187639]: 2026-02-23 11:20:35.634 187643 INFO nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Deletion of /var/lib/nova/instances/d6fe1241-58d7-4610-81f3-a3f8564f59d4_del complete
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.814 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.815 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.815 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.815 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.815 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.816 187643 WARNING nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received unexpected event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with vm_state active and task_state migrating.
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.816 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.816 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.816 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.817 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.817 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.817 187643 WARNING nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received unexpected event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with vm_state active and task_state migrating.
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.817 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.817 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.817 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.818 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.818 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.818 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-unplugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.818 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.818 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.819 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.819 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.819 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.819 187643 WARNING nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received unexpected event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with vm_state active and task_state migrating.
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.819 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.819 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.820 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.820 187643 DEBUG oslo_concurrency.lockutils [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.820 187643 DEBUG nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] No waiting events found dispatching network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:20:36 compute-0 nova_compute[187639]: 2026-02-23 11:20:36.820 187643 WARNING nova.compute.manager [req-51616779-c225-4cbe-a664-bae99267575b req-c2d5f88a-ff55-4748-82cf-257263ac6906 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Received unexpected event network-vif-plugged-a82f383e-fe79-4df7-ba9e-233aabc0ab23 for instance with vm_state active and task_state migrating.
Feb 23 11:20:39 compute-0 nova_compute[187639]: 2026-02-23 11:20:39.656 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 11:20:39 compute-0 systemd[216944]: Activating special unit Exit the Session...
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped target Main User Target.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped target Basic System.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped target Paths.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped target Sockets.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped target Timers.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 11:20:39 compute-0 systemd[216944]: Closed D-Bus User Message Bus Socket.
Feb 23 11:20:39 compute-0 systemd[216944]: Stopped Create User's Volatile Files and Directories.
Feb 23 11:20:39 compute-0 systemd[216944]: Removed slice User Application Slice.
Feb 23 11:20:39 compute-0 systemd[216944]: Reached target Shutdown.
Feb 23 11:20:39 compute-0 systemd[216944]: Finished Exit the Session.
Feb 23 11:20:39 compute-0 systemd[216944]: Reached target Exit the Session.
Feb 23 11:20:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 11:20:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 11:20:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 11:20:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 11:20:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 11:20:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 11:20:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 11:20:40 compute-0 nova_compute[187639]: 2026-02-23 11:20:40.662 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.290 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.291 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.291 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "d6fe1241-58d7-4610-81f3-a3f8564f59d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.311 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.312 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.312 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.313 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:20:41 compute-0 podman[217086]: 2026-02-23 11:20:41.41887809 +0000 UTC m=+0.058412503 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.441 187643 WARNING nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.442 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5792MB free_disk=73.20448684692383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.443 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.443 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.481 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance d6fe1241-58d7-4610-81f3-a3f8564f59d4 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.501 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.541 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration d15a2bbd-f3a4-46f4-a5c9-12732188441a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.541 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.541 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.589 187643 DEBUG nova.compute.provider_tree [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.605 187643 DEBUG nova.scheduler.client.report [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.631 187643 DEBUG nova.compute.resource_tracker [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.631 187643 DEBUG oslo_concurrency.lockutils [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.635 187643 INFO nova.compute.manager [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.716 187643 INFO nova.scheduler.client.report [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration d15a2bbd-f3a4-46f4-a5c9-12732188441a
Feb 23 11:20:41 compute-0 nova_compute[187639]: 2026-02-23 11:20:41.716 187643 DEBUG nova.virt.libvirt.driver [None req-8245739d-de18-4a00-ab9b-dca385f056ad a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 11:20:44 compute-0 nova_compute[187639]: 2026-02-23 11:20:44.658 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:45 compute-0 nova_compute[187639]: 2026-02-23 11:20:45.665 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:46 compute-0 sshd-session[217105]: Invalid user admin from 165.227.79.48 port 46586
Feb 23 11:20:46 compute-0 sshd-session[217105]: Connection closed by invalid user admin 165.227.79.48 port 46586 [preauth]
Feb 23 11:20:49 compute-0 nova_compute[187639]: 2026-02-23 11:20:49.415 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845634.4138823, d6fe1241-58d7-4610-81f3-a3f8564f59d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:20:49 compute-0 nova_compute[187639]: 2026-02-23 11:20:49.416 187643 INFO nova.compute.manager [-] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] VM Stopped (Lifecycle Event)
Feb 23 11:20:49 compute-0 nova_compute[187639]: 2026-02-23 11:20:49.433 187643 DEBUG nova.compute.manager [None req-8c416919-6dde-4478-9c4a-ffa6c61f5b4d - - - - - -] [instance: d6fe1241-58d7-4610-81f3-a3f8564f59d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:20:49 compute-0 nova_compute[187639]: 2026-02-23 11:20:49.659 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:50 compute-0 nova_compute[187639]: 2026-02-23 11:20:50.666 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:51 compute-0 podman[217107]: 2026-02-23 11:20:51.179898285 +0000 UTC m=+0.084851261 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 11:20:53 compute-0 podman[217133]: 2026-02-23 11:20:53.875393092 +0000 UTC m=+0.075227736 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 
'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 11:20:54 compute-0 nova_compute[187639]: 2026-02-23 11:20:54.660 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:55 compute-0 nova_compute[187639]: 2026-02-23 11:20:55.669 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:58 compute-0 sshd-session[217156]: Invalid user user from 143.198.30.3 port 35492
Feb 23 11:20:58 compute-0 sshd-session[217156]: Connection closed by invalid user user 143.198.30.3 port 35492 [preauth]
Feb 23 11:20:59 compute-0 nova_compute[187639]: 2026-02-23 11:20:59.662 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:20:59 compute-0 podman[197002]: time="2026-02-23T11:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:20:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:20:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 23 11:21:00 compute-0 nova_compute[187639]: 2026-02-23 11:21:00.717 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:01 compute-0 openstack_network_exporter[199919]: ERROR   11:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:21:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:21:01 compute-0 openstack_network_exporter[199919]: ERROR   11:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:21:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:21:04 compute-0 nova_compute[187639]: 2026-02-23 11:21:04.663 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:04 compute-0 nova_compute[187639]: 2026-02-23 11:21:04.759 187643 DEBUG nova.compute.manager [None req-cc470c28-232f-4491-8975-cd6e047e1b04 d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 23 11:21:04 compute-0 nova_compute[187639]: 2026-02-23 11:21:04.803 187643 DEBUG nova.compute.provider_tree [None req-cc470c28-232f-4491-8975-cd6e047e1b04 d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Updating resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 generation from 38 to 41 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 11:21:04 compute-0 podman[217158]: 2026-02-23 11:21:04.862450544 +0000 UTC m=+0.059329947 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:21:05 compute-0 nova_compute[187639]: 2026-02-23 11:21:05.750 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:08 compute-0 nova_compute[187639]: 2026-02-23 11:21:08.696 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:09 compute-0 nova_compute[187639]: 2026-02-23 11:21:09.666 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:09 compute-0 nova_compute[187639]: 2026-02-23 11:21:09.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:10 compute-0 nova_compute[187639]: 2026-02-23 11:21:10.784 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:11 compute-0 podman[217183]: 2026-02-23 11:21:11.854339387 +0000 UTC m=+0.053014270 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 11:21:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:12.666 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:12.666 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:12.666 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:12 compute-0 nova_compute[187639]: 2026-02-23 11:21:12.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:14 compute-0 nova_compute[187639]: 2026-02-23 11:21:14.669 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.690 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.708 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.708 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:21:15 compute-0 nova_compute[187639]: 2026-02-23 11:21:15.821 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:16 compute-0 nova_compute[187639]: 2026-02-23 11:21:16.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:18 compute-0 nova_compute[187639]: 2026-02-23 11:21:18.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:19 compute-0 nova_compute[187639]: 2026-02-23 11:21:19.712 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:20 compute-0 nova_compute[187639]: 2026-02-23 11:21:20.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:20 compute-0 nova_compute[187639]: 2026-02-23 11:21:20.823 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:21 compute-0 podman[217202]: 2026-02-23 11:21:21.953354145 +0000 UTC m=+0.141159936 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.726 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.726 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.727 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.906 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.907 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5806MB free_disk=73.20450592041016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.907 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.907 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.986 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:21:22 compute-0 nova_compute[187639]: 2026-02-23 11:21:22.986 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.003 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.027 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.027 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.044 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.068 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.101 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.117 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.120 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:21:23 compute-0 nova_compute[187639]: 2026-02-23 11:21:23.120 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:24 compute-0 nova_compute[187639]: 2026-02-23 11:21:24.714 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:24 compute-0 podman[217228]: 2026-02-23 11:21:24.832353276 +0000 UTC m=+0.036706090 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 23 11:21:25 compute-0 nova_compute[187639]: 2026-02-23 11:21:25.889 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:28 compute-0 sshd-session[217249]: Invalid user user from 143.198.30.3 port 50242
Feb 23 11:21:28 compute-0 sshd-session[217249]: Connection closed by invalid user user 143.198.30.3 port 50242 [preauth]
Feb 23 11:21:29 compute-0 nova_compute[187639]: 2026-02-23 11:21:29.716 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:29 compute-0 podman[197002]: time="2026-02-23T11:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:21:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:21:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 23 11:21:30 compute-0 nova_compute[187639]: 2026-02-23 11:21:30.891 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:31 compute-0 openstack_network_exporter[199919]: ERROR   11:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:21:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:21:31 compute-0 openstack_network_exporter[199919]: ERROR   11:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:21:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:21:31 compute-0 sshd-session[217251]: Invalid user admin from 165.227.79.48 port 52000
Feb 23 11:21:31 compute-0 sshd-session[217251]: Connection closed by invalid user admin 165.227.79.48 port 52000 [preauth]
Feb 23 11:21:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:33.725 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:21:33 compute-0 nova_compute[187639]: 2026-02-23 11:21:33.726 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:33.726 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:21:33 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:33.728 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:34 compute-0 nova_compute[187639]: 2026-02-23 11:21:34.718 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:35 compute-0 podman[217253]: 2026-02-23 11:21:35.851445374 +0000 UTC m=+0.051821979 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:21:35 compute-0 nova_compute[187639]: 2026-02-23 11:21:35.932 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:39 compute-0 nova_compute[187639]: 2026-02-23 11:21:39.739 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.459 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.459 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.486 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.587 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.588 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.598 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.599 187643 INFO nova.compute.claims [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.725 187643 DEBUG nova.compute.provider_tree [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.754 187643 DEBUG nova.scheduler.client.report [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.802 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.803 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.872 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.873 187643 DEBUG nova.network.neutron [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.894 187643 INFO nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.927 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:21:40 compute-0 nova_compute[187639]: 2026-02-23 11:21:40.970 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.018 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.019 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.020 187643 INFO nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Creating image(s)
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.020 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "/var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.020 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "/var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.021 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "/var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.039 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.123 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.124 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.124 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.140 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.200 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.201 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.236 187643 DEBUG nova.policy [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1cc69df7c7464f81ae1446f3587ebd7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f82b70ddbd84b29baad3bb3a8bc340d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.238 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.239 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.239 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.300 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.302 187643 DEBUG nova.virt.disk.api [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Checking if we can resize image /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.303 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.360 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.362 187643 DEBUG nova.virt.disk.api [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Cannot resize image /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.363 187643 DEBUG nova.objects.instance [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lazy-loading 'migration_context' on Instance uuid 1e192505-df0b-49ed-8cf3-a77e144e8ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.392 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.393 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Ensure instance console log exists: /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.393 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.394 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.395 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:41 compute-0 nova_compute[187639]: 2026-02-23 11:21:41.739 187643 DEBUG nova.network.neutron [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Successfully created port: 36da1c62-1081-4041-ac32-4925d6a4ecb8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:21:42 compute-0 podman[217292]: 2026-02-23 11:21:42.858825017 +0000 UTC m=+0.063044925 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 11:21:42 compute-0 nova_compute[187639]: 2026-02-23 11:21:42.958 187643 DEBUG nova.network.neutron [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Successfully updated port: 36da1c62-1081-4041-ac32-4925d6a4ecb8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:21:42 compute-0 nova_compute[187639]: 2026-02-23 11:21:42.972 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:21:42 compute-0 nova_compute[187639]: 2026-02-23 11:21:42.973 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquired lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:21:42 compute-0 nova_compute[187639]: 2026-02-23 11:21:42.973 187643 DEBUG nova.network.neutron [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:21:43 compute-0 nova_compute[187639]: 2026-02-23 11:21:43.101 187643 DEBUG nova.compute.manager [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received event network-changed-36da1c62-1081-4041-ac32-4925d6a4ecb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:21:43 compute-0 nova_compute[187639]: 2026-02-23 11:21:43.101 187643 DEBUG nova.compute.manager [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Refreshing instance network info cache due to event network-changed-36da1c62-1081-4041-ac32-4925d6a4ecb8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:21:43 compute-0 nova_compute[187639]: 2026-02-23 11:21:43.102 187643 DEBUG oslo_concurrency.lockutils [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:21:43 compute-0 ovn_controller[97601]: 2026-02-23T11:21:43Z|00200|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 11:21:43 compute-0 nova_compute[187639]: 2026-02-23 11:21:43.522 187643 DEBUG nova.network.neutron [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.566 187643 DEBUG nova.network.neutron [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Updating instance_info_cache with network_info: [{"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.647 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Releasing lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.647 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Instance network_info: |[{"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.648 187643 DEBUG oslo_concurrency.lockutils [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.648 187643 DEBUG nova.network.neutron [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Refreshing network info cache for port 36da1c62-1081-4041-ac32-4925d6a4ecb8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.651 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Start _get_guest_xml network_info=[{"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.655 187643 WARNING nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.661 187643 DEBUG nova.virt.libvirt.host [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.661 187643 DEBUG nova.virt.libvirt.host [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.674 187643 DEBUG nova.virt.libvirt.host [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.676 187643 DEBUG nova.virt.libvirt.host [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.677 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.677 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.677 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.678 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.678 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.678 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.678 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.679 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.679 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.679 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.679 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.680 187643 DEBUG nova.virt.hardware [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.684 187643 DEBUG nova.virt.libvirt.vif [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:21:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1004673487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1004673487',id=27,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f82b70ddbd84b29baad3bb3a8bc340d',ramdisk_id='',reservation_id='r-0zu0clyc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:21:40Z,user_data=None,user_id='1cc69df7c7464f81ae1446f3587ebd7e',uuid=1e192505-df0b-49ed-8cf3-a77e144e8ab3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.685 187643 DEBUG nova.network.os_vif_util [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converting VIF {"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.685 187643 DEBUG nova.network.os_vif_util [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.686 187643 DEBUG nova.objects.instance [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e192505-df0b-49ed-8cf3-a77e144e8ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.726 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <uuid>1e192505-df0b-49ed-8cf3-a77e144e8ab3</uuid>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <name>instance-0000001b</name>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-1004673487</nova:name>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:21:44</nova:creationTime>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:user uuid="1cc69df7c7464f81ae1446f3587ebd7e">tempest-TestExecuteWorkloadBalancingStrategy-63810276-project-member</nova:user>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:project uuid="9f82b70ddbd84b29baad3bb3a8bc340d">tempest-TestExecuteWorkloadBalancingStrategy-63810276</nova:project>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         <nova:port uuid="36da1c62-1081-4041-ac32-4925d6a4ecb8">
Feb 23 11:21:44 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <system>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <entry name="serial">1e192505-df0b-49ed-8cf3-a77e144e8ab3</entry>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <entry name="uuid">1e192505-df0b-49ed-8cf3-a77e144e8ab3</entry>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </system>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <os>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </os>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <features>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </features>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.config"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:45:57:0e"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <target dev="tap36da1c62-10"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/console.log" append="off"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <video>
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </video>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:21:44 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:21:44 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:21:44 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:21:44 compute-0 nova_compute[187639]: </domain>
Feb 23 11:21:44 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.727 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Preparing to wait for external event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.728 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.728 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.728 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.729 187643 DEBUG nova.virt.libvirt.vif [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:21:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1004673487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1004673487',id=27,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f82b70ddbd84b29baad3bb3a8bc340d',ramdisk_id='',reservation_id='r-0zu0clyc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:21:40Z,user_data=None,user_id='1cc69df7c7464f81ae1446f3587ebd7e',uuid=1e192505-df0b-49ed-8cf3-a77e144e8ab3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.729 187643 DEBUG nova.network.os_vif_util [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converting VIF {"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.730 187643 DEBUG nova.network.os_vif_util [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.730 187643 DEBUG os_vif [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.730 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.731 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.731 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.734 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.734 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36da1c62-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.735 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36da1c62-10, col_values=(('external_ids', {'iface-id': '36da1c62-1081-4041-ac32-4925d6a4ecb8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:57:0e', 'vm-uuid': '1e192505-df0b-49ed-8cf3-a77e144e8ab3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.736 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:44 compute-0 NetworkManager[57207]: <info>  [1771845704.7370] manager: (tap36da1c62-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.738 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.743 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.744 187643 INFO os_vif [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10')
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.745 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.948 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.949 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.949 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] No VIF found with MAC fa:16:3e:45:57:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:21:44 compute-0 nova_compute[187639]: 2026-02-23 11:21:44.950 187643 INFO nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Using config drive
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.563 187643 INFO nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Creating config drive at /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.config
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.566 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpryluapht execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.690 187643 DEBUG oslo_concurrency.processutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpryluapht" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:21:45 compute-0 kernel: tap36da1c62-10: entered promiscuous mode
Feb 23 11:21:45 compute-0 NetworkManager[57207]: <info>  [1771845705.7399] manager: (tap36da1c62-10): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.740 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:45 compute-0 ovn_controller[97601]: 2026-02-23T11:21:45Z|00201|binding|INFO|Claiming lport 36da1c62-1081-4041-ac32-4925d6a4ecb8 for this chassis.
Feb 23 11:21:45 compute-0 ovn_controller[97601]: 2026-02-23T11:21:45Z|00202|binding|INFO|36da1c62-1081-4041-ac32-4925d6a4ecb8: Claiming fa:16:3e:45:57:0e 10.100.0.8
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.742 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.744 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.748 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.759 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:57:0e 10.100.0.8'], port_security=['fa:16:3e:45:57:0e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e192505-df0b-49ed-8cf3-a77e144e8ab3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84b587ef-d196-4e10-83df-6c7772bec83e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f82b70ddbd84b29baad3bb3a8bc340d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eeda88b2-1534-4932-a80b-26165748fdb9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd2c591e-d0ea-472e-82c2-94f32e196de6, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=36da1c62-1081-4041-ac32-4925d6a4ecb8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.761 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 36da1c62-1081-4041-ac32-4925d6a4ecb8 in datapath 84b587ef-d196-4e10-83df-6c7772bec83e bound to our chassis
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.762 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84b587ef-d196-4e10-83df-6c7772bec83e
Feb 23 11:21:45 compute-0 systemd-machined[156970]: New machine qemu-19-instance-0000001b.
Feb 23 11:21:45 compute-0 systemd-udevd[217330]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:21:45 compute-0 ovn_controller[97601]: 2026-02-23T11:21:45Z|00203|binding|INFO|Setting lport 36da1c62-1081-4041-ac32-4925d6a4ecb8 ovn-installed in OVS
Feb 23 11:21:45 compute-0 ovn_controller[97601]: 2026-02-23T11:21:45Z|00204|binding|INFO|Setting lport 36da1c62-1081-4041-ac32-4925d6a4ecb8 up in Southbound
Feb 23 11:21:45 compute-0 nova_compute[187639]: 2026-02-23 11:21:45.774 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:45 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000001b.
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.776 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[ea587b59-47fb-4575-a378-470f9eefc32a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.776 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap84b587ef-d1 in ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.779 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap84b587ef-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.779 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[517be4c8-e361-4855-877f-2ea5625b38c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 NetworkManager[57207]: <info>  [1771845705.7818] device (tap36da1c62-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.781 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d320134e-2b0e-4002-bc00-1535179adf1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 NetworkManager[57207]: <info>  [1771845705.7830] device (tap36da1c62-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.792 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[11b3aaba-267a-4a11-b60c-06db23b5d56f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.809 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[32218cf1-0579-4a40-bf28-9a878fdb5204]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.830 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[42586fd7-2f28-4307-804d-2efb58c61f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 NetworkManager[57207]: <info>  [1771845705.8371] manager: (tap84b587ef-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.837 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1d2f5d-b354-4107-93ec-d53bfb6321dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.868 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7a5b02-da76-40ee-a2b0-9f73c680506f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.872 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[613d48ea-94e7-42e3-a42c-c4216ce898fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 NetworkManager[57207]: <info>  [1771845705.8875] device (tap84b587ef-d0): carrier: link connected
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.889 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[1998aa86-8c67-4b36-8e07-9aaf7fc90522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.904 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff7f1a1-68fc-4999-a68c-ee7db8ce43ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84b587ef-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:7a:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485005, 'reachable_time': 24929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217363, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.914 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[006bea7b-9268-4983-9ea8-e903380d9e40]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:7a73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485005, 'tstamp': 485005}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217364, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.928 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5f8a8a-141f-46f9-8086-6f072c7ede7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84b587ef-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:7a:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485005, 'reachable_time': 24929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217365, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.951 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[125d3643-bb68-4599-a0bd-f3aad5943c15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.998 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[16e6ca29-47ec-43cf-ad6c-4f53a8d94963]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.999 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84b587ef-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.999 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:21:45 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:45.999 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84b587ef-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:46 compute-0 kernel: tap84b587ef-d0: entered promiscuous mode
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.001 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:46 compute-0 NetworkManager[57207]: <info>  [1771845706.0021] manager: (tap84b587ef-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:46.004 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84b587ef-d0, col_values=(('external_ids', {'iface-id': '5301ae8b-3d15-4378-9855-275f31f571b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.005 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:46 compute-0 ovn_controller[97601]: 2026-02-23T11:21:46Z|00205|binding|INFO|Releasing lport 5301ae8b-3d15-4378-9855-275f31f571b5 from this chassis (sb_readonly=0)
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.006 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:46.007 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/84b587ef-d196-4e10-83df-6c7772bec83e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/84b587ef-d196-4e10-83df-6c7772bec83e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:46.007 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[a04175e5-0bef-4b23-b57d-73293469e6a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:46.008 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-84b587ef-d196-4e10-83df-6c7772bec83e
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/84b587ef-d196-4e10-83df-6c7772bec83e.pid.haproxy
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 84b587ef-d196-4e10-83df-6c7772bec83e
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:21:46 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:21:46.008 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'env', 'PROCESS_TAG=haproxy-84b587ef-d196-4e10-83df-6c7772bec83e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/84b587ef-d196-4e10-83df-6c7772bec83e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.009 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.062 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845706.0620806, 1e192505-df0b-49ed-8cf3-a77e144e8ab3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.063 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] VM Started (Lifecycle Event)
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.080 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.086 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845706.062246, 1e192505-df0b-49ed-8cf3-a77e144e8ab3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.086 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] VM Paused (Lifecycle Event)
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.110 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.114 187643 DEBUG nova.compute.manager [req-6c31c138-d8aa-4455-9d3f-61d2d2e39f07 req-ea6e1c86-b4ac-4de9-8751-50495d3f9e9d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.114 187643 DEBUG oslo_concurrency.lockutils [req-6c31c138-d8aa-4455-9d3f-61d2d2e39f07 req-ea6e1c86-b4ac-4de9-8751-50495d3f9e9d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.115 187643 DEBUG oslo_concurrency.lockutils [req-6c31c138-d8aa-4455-9d3f-61d2d2e39f07 req-ea6e1c86-b4ac-4de9-8751-50495d3f9e9d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.115 187643 DEBUG oslo_concurrency.lockutils [req-6c31c138-d8aa-4455-9d3f-61d2d2e39f07 req-ea6e1c86-b4ac-4de9-8751-50495d3f9e9d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.116 187643 DEBUG nova.compute.manager [req-6c31c138-d8aa-4455-9d3f-61d2d2e39f07 req-ea6e1c86-b4ac-4de9-8751-50495d3f9e9d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Processing event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.117 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.119 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.122 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.124 187643 INFO nova.virt.libvirt.driver [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Instance spawned successfully.
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.124 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.139 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.139 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845706.120614, 1e192505-df0b-49ed-8cf3-a77e144e8ab3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.140 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] VM Resumed (Lifecycle Event)
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.154 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.154 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.155 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.155 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.156 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.157 187643 DEBUG nova.virt.libvirt.driver [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.163 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.166 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.176 187643 DEBUG nova.network.neutron [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Updated VIF entry in instance network info cache for port 36da1c62-1081-4041-ac32-4925d6a4ecb8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.176 187643 DEBUG nova.network.neutron [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Updating instance_info_cache with network_info: [{"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.192 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.205 187643 DEBUG oslo_concurrency.lockutils [req-8c49621c-bba9-4620-9bb5-1f9ef179de4a req-32d82fc8-7def-4d09-85a6-5153ae4945cd 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.220 187643 INFO nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Took 5.20 seconds to spawn the instance on the hypervisor.
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.220 187643 DEBUG nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.276 187643 INFO nova.compute.manager [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Took 5.73 seconds to build instance.
Feb 23 11:21:46 compute-0 nova_compute[187639]: 2026-02-23 11:21:46.290 187643 DEBUG oslo_concurrency.lockutils [None req-a6aef006-6a8d-4c45-98d5-ffb9ac923afc 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:46 compute-0 podman[217405]: 2026-02-23 11:21:46.365358479 +0000 UTC m=+0.065931681 container create d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 11:21:46 compute-0 systemd[1]: Started libpod-conmon-d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e.scope.
Feb 23 11:21:46 compute-0 podman[217405]: 2026-02-23 11:21:46.329836852 +0000 UTC m=+0.030410094 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:21:46 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:21:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1f59920ff2f33c48d29b3fbe1aa4c66d65909698e3e68bf32d422d5533c17ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:21:46 compute-0 podman[217405]: 2026-02-23 11:21:46.469335063 +0000 UTC m=+0.169908305 container init d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:21:46 compute-0 podman[217405]: 2026-02-23 11:21:46.477049857 +0000 UTC m=+0.177623059 container start d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 11:21:46 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [NOTICE]   (217424) : New worker (217426) forked
Feb 23 11:21:46 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [NOTICE]   (217424) : Loading success.
Feb 23 11:21:48 compute-0 nova_compute[187639]: 2026-02-23 11:21:48.230 187643 DEBUG nova.compute.manager [req-258a9a66-6ead-491f-baa4-d06f10eaaba9 req-ee4d7e01-81f8-4bf5-bb03-f8d3e8d38433 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:21:48 compute-0 nova_compute[187639]: 2026-02-23 11:21:48.232 187643 DEBUG oslo_concurrency.lockutils [req-258a9a66-6ead-491f-baa4-d06f10eaaba9 req-ee4d7e01-81f8-4bf5-bb03-f8d3e8d38433 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:21:48 compute-0 nova_compute[187639]: 2026-02-23 11:21:48.233 187643 DEBUG oslo_concurrency.lockutils [req-258a9a66-6ead-491f-baa4-d06f10eaaba9 req-ee4d7e01-81f8-4bf5-bb03-f8d3e8d38433 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:21:48 compute-0 nova_compute[187639]: 2026-02-23 11:21:48.233 187643 DEBUG oslo_concurrency.lockutils [req-258a9a66-6ead-491f-baa4-d06f10eaaba9 req-ee4d7e01-81f8-4bf5-bb03-f8d3e8d38433 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:21:48 compute-0 nova_compute[187639]: 2026-02-23 11:21:48.234 187643 DEBUG nova.compute.manager [req-258a9a66-6ead-491f-baa4-d06f10eaaba9 req-ee4d7e01-81f8-4bf5-bb03-f8d3e8d38433 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] No waiting events found dispatching network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:21:48 compute-0 nova_compute[187639]: 2026-02-23 11:21:48.234 187643 WARNING nova.compute.manager [req-258a9a66-6ead-491f-baa4-d06f10eaaba9 req-ee4d7e01-81f8-4bf5-bb03-f8d3e8d38433 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received unexpected event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 for instance with vm_state active and task_state None.
Feb 23 11:21:49 compute-0 nova_compute[187639]: 2026-02-23 11:21:49.738 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:49 compute-0 nova_compute[187639]: 2026-02-23 11:21:49.746 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:52 compute-0 podman[217437]: 2026-02-23 11:21:52.855109543 +0000 UTC m=+0.060782595 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller)
Feb 23 11:21:54 compute-0 nova_compute[187639]: 2026-02-23 11:21:54.748 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:21:54 compute-0 nova_compute[187639]: 2026-02-23 11:21:54.752 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:21:54 compute-0 nova_compute[187639]: 2026-02-23 11:21:54.752 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 11:21:54 compute-0 nova_compute[187639]: 2026-02-23 11:21:54.753 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:21:54 compute-0 nova_compute[187639]: 2026-02-23 11:21:54.774 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:21:54 compute-0 nova_compute[187639]: 2026-02-23 11:21:54.775 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:21:55 compute-0 podman[217464]: 2026-02-23 11:21:55.867836943 +0000 UTC m=+0.061053772 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 23 11:21:58 compute-0 ovn_controller[97601]: 2026-02-23T11:21:58Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:57:0e 10.100.0.8
Feb 23 11:21:58 compute-0 ovn_controller[97601]: 2026-02-23T11:21:58Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:57:0e 10.100.0.8
Feb 23 11:21:59 compute-0 podman[197002]: time="2026-02-23T11:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:21:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:21:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 23 11:21:59 compute-0 nova_compute[187639]: 2026-02-23 11:21:59.776 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:22:00 compute-0 sshd-session[217497]: Invalid user user from 143.198.30.3 port 47630
Feb 23 11:22:00 compute-0 sshd-session[217497]: Connection closed by invalid user user 143.198.30.3 port 47630 [preauth]
Feb 23 11:22:01 compute-0 openstack_network_exporter[199919]: ERROR   11:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:22:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:22:01 compute-0 openstack_network_exporter[199919]: ERROR   11:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:22:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:22:04 compute-0 nova_compute[187639]: 2026-02-23 11:22:04.778 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:22:04 compute-0 nova_compute[187639]: 2026-02-23 11:22:04.780 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:22:04 compute-0 nova_compute[187639]: 2026-02-23 11:22:04.781 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 11:22:04 compute-0 nova_compute[187639]: 2026-02-23 11:22:04.781 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:22:04 compute-0 nova_compute[187639]: 2026-02-23 11:22:04.895 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:04 compute-0 nova_compute[187639]: 2026-02-23 11:22:04.895 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 11:22:06 compute-0 podman[217499]: 2026-02-23 11:22:06.842215201 +0000 UTC m=+0.042708998 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:22:08 compute-0 nova_compute[187639]: 2026-02-23 11:22:08.081 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Creating tmpfile /var/lib/nova/instances/tmpeexkrs9m to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 23 11:22:08 compute-0 nova_compute[187639]: 2026-02-23 11:22:08.177 187643 DEBUG nova.compute.manager [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeexkrs9m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 23 11:22:09 compute-0 nova_compute[187639]: 2026-02-23 11:22:09.128 187643 DEBUG nova.compute.manager [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeexkrs9m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b71bc235-14d4-46a5-8f6d-bd5dc25af5a2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 23 11:22:09 compute-0 nova_compute[187639]: 2026-02-23 11:22:09.163 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "refresh_cache-b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:22:09 compute-0 nova_compute[187639]: 2026-02-23 11:22:09.164 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquired lock "refresh_cache-b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:22:09 compute-0 nova_compute[187639]: 2026-02-23 11:22:09.164 187643 DEBUG nova.network.neutron [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:22:09 compute-0 nova_compute[187639]: 2026-02-23 11:22:09.897 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.614 187643 DEBUG nova.network.neutron [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Updating instance_info_cache with network_info: [{"id": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "address": "fa:16:3e:f1:cd:77", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d1fd389-ac", "ovs_interfaceid": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.640 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Releasing lock "refresh_cache-b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.641 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeexkrs9m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b71bc235-14d4-46a5-8f6d-bd5dc25af5a2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.642 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Creating instance directory: /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.642 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Creating disk.info with the contents: {'/var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk': 'qcow2', '/var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.642 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.643 187643 DEBUG nova.objects.instance [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.676 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.719 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.720 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.721 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.742 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.800 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.801 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.823 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.824 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.824 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.868 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.869 187643 DEBUG nova.virt.disk.api [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Checking if we can resize image /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.870 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.916 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.917 187643 DEBUG nova.virt.disk.api [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Cannot resize image /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.917 187643 DEBUG nova.objects.instance [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lazy-loading 'migration_context' on Instance uuid b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.934 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.949 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk.config 485376" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.950 187643 DEBUG nova.virt.libvirt.volume.remotefs [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk.config to /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 23 11:22:10 compute-0 nova_compute[187639]: 2026-02-23 11:22:10.951 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk.config /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.121 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.325 187643 DEBUG oslo_concurrency.processutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk.config /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" returned: 0 in 0.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.326 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.328 187643 DEBUG nova.virt.libvirt.vif [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:21:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1201356693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1201356693',id=28,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:21:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f82b70ddbd84b29baad3bb3a8bc340d',ramdisk_id='',reservation_id='r-5ksxs7l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:22:00Z,user_data=None,user_id='1cc69df7c7464f81ae1446f3587ebd7e',uuid=b71bc235-14d4-46a5-8f6d-bd5dc25af5a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "address": "fa:16:3e:f1:cd:77", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d1fd389-ac", "ovs_interfaceid": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.329 187643 DEBUG nova.network.os_vif_util [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Converting VIF {"id": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "address": "fa:16:3e:f1:cd:77", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d1fd389-ac", "ovs_interfaceid": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.330 187643 DEBUG nova.network.os_vif_util [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:cd:77,bridge_name='br-int',has_traffic_filtering=True,id=1d1fd389-ac11-43ba-9bb5-45c9794c71bc,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d1fd389-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.331 187643 DEBUG os_vif [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:cd:77,bridge_name='br-int',has_traffic_filtering=True,id=1d1fd389-ac11-43ba-9bb5-45c9794c71bc,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d1fd389-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.332 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.333 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.334 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.337 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.338 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d1fd389-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.339 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1d1fd389-ac, col_values=(('external_ids', {'iface-id': '1d1fd389-ac11-43ba-9bb5-45c9794c71bc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:cd:77', 'vm-uuid': 'b71bc235-14d4-46a5-8f6d-bd5dc25af5a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.340 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:11 compute-0 NetworkManager[57207]: <info>  [1771845731.3413] manager: (tap1d1fd389-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.343 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.346 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.347 187643 INFO os_vif [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:cd:77,bridge_name='br-int',has_traffic_filtering=True,id=1d1fd389-ac11-43ba-9bb5-45c9794c71bc,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d1fd389-ac')
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.348 187643 DEBUG nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 23 11:22:11 compute-0 nova_compute[187639]: 2026-02-23 11:22:11.348 187643 DEBUG nova.compute.manager [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeexkrs9m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b71bc235-14d4-46a5-8f6d-bd5dc25af5a2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 23 11:22:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:12.667 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:12.667 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:12.668 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:13 compute-0 sshd-session[217547]: Invalid user admin from 165.227.79.48 port 39036
Feb 23 11:22:13 compute-0 sshd-session[217547]: Connection closed by invalid user admin 165.227.79.48 port 39036 [preauth]
Feb 23 11:22:13 compute-0 podman[217549]: 2026-02-23 11:22:13.12947691 +0000 UTC m=+0.045141622 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 23 11:22:13 compute-0 nova_compute[187639]: 2026-02-23 11:22:13.684 187643 DEBUG nova.network.neutron [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Port 1d1fd389-ac11-43ba-9bb5-45c9794c71bc updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 23 11:22:13 compute-0 nova_compute[187639]: 2026-02-23 11:22:13.687 187643 DEBUG nova.compute.manager [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeexkrs9m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b71bc235-14d4-46a5-8f6d-bd5dc25af5a2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 23 11:22:14 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 23 11:22:14 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 23 11:22:14 compute-0 kernel: tap1d1fd389-ac: entered promiscuous mode
Feb 23 11:22:14 compute-0 NetworkManager[57207]: <info>  [1771845734.1518] manager: (tap1d1fd389-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Feb 23 11:22:14 compute-0 ovn_controller[97601]: 2026-02-23T11:22:14Z|00206|binding|INFO|Claiming lport 1d1fd389-ac11-43ba-9bb5-45c9794c71bc for this additional chassis.
Feb 23 11:22:14 compute-0 ovn_controller[97601]: 2026-02-23T11:22:14Z|00207|binding|INFO|1d1fd389-ac11-43ba-9bb5-45c9794c71bc: Claiming fa:16:3e:f1:cd:77 10.100.0.11
Feb 23 11:22:14 compute-0 nova_compute[187639]: 2026-02-23 11:22:14.154 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:14 compute-0 ovn_controller[97601]: 2026-02-23T11:22:14Z|00208|binding|INFO|Setting lport 1d1fd389-ac11-43ba-9bb5-45c9794c71bc ovn-installed in OVS
Feb 23 11:22:14 compute-0 nova_compute[187639]: 2026-02-23 11:22:14.168 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:14 compute-0 nova_compute[187639]: 2026-02-23 11:22:14.170 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:14 compute-0 systemd-machined[156970]: New machine qemu-20-instance-0000001c.
Feb 23 11:22:14 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001c.
Feb 23 11:22:14 compute-0 systemd-udevd[217603]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:22:14 compute-0 NetworkManager[57207]: <info>  [1771845734.2133] device (tap1d1fd389-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:22:14 compute-0 NetworkManager[57207]: <info>  [1771845734.2138] device (tap1d1fd389-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:22:14 compute-0 nova_compute[187639]: 2026-02-23 11:22:14.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:14 compute-0 nova_compute[187639]: 2026-02-23 11:22:14.899 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.487 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845735.4873111, b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.488 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] VM Started (Lifecycle Event)
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.513 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.863 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.864 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.865 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:22:15 compute-0 nova_compute[187639]: 2026-02-23 11:22:15.865 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1e192505-df0b-49ed-8cf3-a77e144e8ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:22:16 compute-0 nova_compute[187639]: 2026-02-23 11:22:16.227 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845736.2270992, b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:22:16 compute-0 nova_compute[187639]: 2026-02-23 11:22:16.228 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] VM Resumed (Lifecycle Event)
Feb 23 11:22:16 compute-0 nova_compute[187639]: 2026-02-23 11:22:16.259 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:22:16 compute-0 nova_compute[187639]: 2026-02-23 11:22:16.261 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:22:16 compute-0 nova_compute[187639]: 2026-02-23 11:22:16.282 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 23 11:22:16 compute-0 nova_compute[187639]: 2026-02-23 11:22:16.340 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.674 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Updating instance_info_cache with network_info: [{"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.691 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-1e192505-df0b-49ed-8cf3-a77e144e8ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:22:18 compute-0 ovn_controller[97601]: 2026-02-23T11:22:18Z|00209|binding|INFO|Claiming lport 1d1fd389-ac11-43ba-9bb5-45c9794c71bc for this chassis.
Feb 23 11:22:18 compute-0 ovn_controller[97601]: 2026-02-23T11:22:18Z|00210|binding|INFO|1d1fd389-ac11-43ba-9bb5-45c9794c71bc: Claiming fa:16:3e:f1:cd:77 10.100.0.11
Feb 23 11:22:18 compute-0 ovn_controller[97601]: 2026-02-23T11:22:18Z|00211|binding|INFO|Setting lport 1d1fd389-ac11-43ba-9bb5-45c9794c71bc up in Southbound
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.741 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:cd:77 10.100.0.11'], port_security=['fa:16:3e:f1:cd:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b71bc235-14d4-46a5-8f6d-bd5dc25af5a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84b587ef-d196-4e10-83df-6c7772bec83e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f82b70ddbd84b29baad3bb3a8bc340d', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'eeda88b2-1534-4932-a80b-26165748fdb9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd2c591e-d0ea-472e-82c2-94f32e196de6, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=1d1fd389-ac11-43ba-9bb5-45c9794c71bc) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.742 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 1d1fd389-ac11-43ba-9bb5-45c9794c71bc in datapath 84b587ef-d196-4e10-83df-6c7772bec83e bound to our chassis
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.743 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84b587ef-d196-4e10-83df-6c7772bec83e
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.752 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb1eb65-ab96-420d-bead-77731d915704]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.770 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2ecd26-9a85-4c54-9928-f55aa081e973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.773 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[470d968f-22a0-40a8-adae-de4f15032916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.790 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[b6882126-920f-4fe9-be81-65a4b29f16b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.802 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2b313b04-5ad0-4d57-b562-4a91a209cca9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84b587ef-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:7a:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 658, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 658, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485005, 'reachable_time': 25597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217636, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.817 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2857cd4a-7219-43ae-b084-9a57d82cbe49]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap84b587ef-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485013, 'tstamp': 485013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217637, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap84b587ef-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485016, 'tstamp': 485016}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217637, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.818 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84b587ef-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.820 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:18 compute-0 nova_compute[187639]: 2026-02-23 11:22:18.820 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.821 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84b587ef-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.821 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.821 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84b587ef-d0, col_values=(('external_ids', {'iface-id': '5301ae8b-3d15-4378-9855-275f31f571b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:18 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:18.821 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:22:19 compute-0 nova_compute[187639]: 2026-02-23 11:22:19.901 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:19 compute-0 nova_compute[187639]: 2026-02-23 11:22:19.922 187643 INFO nova.compute.manager [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Post operation of migration started
Feb 23 11:22:20 compute-0 nova_compute[187639]: 2026-02-23 11:22:20.199 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "refresh_cache-b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:22:20 compute-0 nova_compute[187639]: 2026-02-23 11:22:20.199 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquired lock "refresh_cache-b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:22:20 compute-0 nova_compute[187639]: 2026-02-23 11:22:20.199 187643 DEBUG nova.network.neutron [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:22:20 compute-0 nova_compute[187639]: 2026-02-23 11:22:20.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:20 compute-0 nova_compute[187639]: 2026-02-23 11:22:20.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:21 compute-0 nova_compute[187639]: 2026-02-23 11:22:21.343 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:21 compute-0 nova_compute[187639]: 2026-02-23 11:22:21.967 187643 DEBUG nova.network.neutron [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Updating instance_info_cache with network_info: [{"id": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "address": "fa:16:3e:f1:cd:77", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d1fd389-ac", "ovs_interfaceid": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:22:21 compute-0 nova_compute[187639]: 2026-02-23 11:22:21.984 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Releasing lock "refresh_cache-b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:22:22 compute-0 nova_compute[187639]: 2026-02-23 11:22:22.001 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:22 compute-0 nova_compute[187639]: 2026-02-23 11:22:22.002 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:22 compute-0 nova_compute[187639]: 2026-02-23 11:22:22.002 187643 DEBUG oslo_concurrency.lockutils [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:22 compute-0 nova_compute[187639]: 2026-02-23 11:22:22.007 187643 INFO nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 23 11:22:22 compute-0 virtqemud[186733]: Domain id=20 name='instance-0000001c' uuid=b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 is tainted: custom-monitor
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.012 187643 INFO nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.724 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.782 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:23 compute-0 podman[217640]: 2026-02-23 11:22:23.83372533 +0000 UTC m=+0.067059491 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.840 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.840 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.880 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.884 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.923 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.924 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:22:23 compute-0 nova_compute[187639]: 2026-02-23 11:22:23.966 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.017 187643 INFO nova.virt.libvirt.driver [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.021 187643 DEBUG nova.compute.manager [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.048 187643 DEBUG nova.objects.instance [None req-135dd174-310c-4902-87a0-046400fe433b d31d70688c13462785910920b902923f c68d4cfcc5c84067ae596f11bca9a9e4 - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.100 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.101 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5515MB free_disk=73.14642715454102GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.101 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.102 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.164 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Migration for instance b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.191 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.219 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance 1e192505-df0b-49ed-8cf3-a77e144e8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.239 187643 WARNING nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Instance b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.239 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.239 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.290 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.302 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.320 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.321 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:24 compute-0 nova_compute[187639]: 2026-02-23 11:22:24.943 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:26 compute-0 nova_compute[187639]: 2026-02-23 11:22:26.345 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:26 compute-0 podman[217680]: 2026-02-23 11:22:26.873731499 +0000 UTC m=+0.061480674 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 11:22:29 compute-0 podman[197002]: time="2026-02-23T11:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:22:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:22:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 23 11:22:29 compute-0 nova_compute[187639]: 2026-02-23 11:22:29.943 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.498 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.499 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.499 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.499 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.499 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.500 187643 INFO nova.compute.manager [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Terminating instance
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.501 187643 DEBUG nova.compute.manager [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:22:30 compute-0 kernel: tap1d1fd389-ac (unregistering): left promiscuous mode
Feb 23 11:22:30 compute-0 NetworkManager[57207]: <info>  [1771845750.5283] device (tap1d1fd389-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:22:30 compute-0 ovn_controller[97601]: 2026-02-23T11:22:30Z|00212|binding|INFO|Releasing lport 1d1fd389-ac11-43ba-9bb5-45c9794c71bc from this chassis (sb_readonly=0)
Feb 23 11:22:30 compute-0 ovn_controller[97601]: 2026-02-23T11:22:30Z|00213|binding|INFO|Setting lport 1d1fd389-ac11-43ba-9bb5-45c9794c71bc down in Southbound
Feb 23 11:22:30 compute-0 ovn_controller[97601]: 2026-02-23T11:22:30Z|00214|binding|INFO|Removing iface tap1d1fd389-ac ovn-installed in OVS
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.540 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.542 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.544 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.548 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:cd:77 10.100.0.11'], port_security=['fa:16:3e:f1:cd:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b71bc235-14d4-46a5-8f6d-bd5dc25af5a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84b587ef-d196-4e10-83df-6c7772bec83e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f82b70ddbd84b29baad3bb3a8bc340d', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'eeda88b2-1534-4932-a80b-26165748fdb9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd2c591e-d0ea-472e-82c2-94f32e196de6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=1d1fd389-ac11-43ba-9bb5-45c9794c71bc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.551 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 1d1fd389-ac11-43ba-9bb5-45c9794c71bc in datapath 84b587ef-d196-4e10-83df-6c7772bec83e unbound from our chassis
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.552 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84b587ef-d196-4e10-83df-6c7772bec83e
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.563 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[00defaa1-6ea4-4958-8182-455ea32b6e31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:30 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Feb 23 11:22:30 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001c.scope: Consumed 2.126s CPU time.
Feb 23 11:22:30 compute-0 systemd-machined[156970]: Machine qemu-20-instance-0000001c terminated.
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.592 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec2c7be-8f54-46e9-9d15-164c8442070d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.594 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[76a48c4f-cbfb-4ade-b359-bd1c2b885b7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.614 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[d78b399a-7673-4dd5-80e6-e3345863b8d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.624 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f782522c-1620-46b2-aa02-a193b1cb0411]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84b587ef-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:7a:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 25, 'tx_packets': 7, 'rx_bytes': 1330, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 25, 'tx_packets': 7, 'rx_bytes': 1330, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485005, 'reachable_time': 25597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217712, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.634 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[66c816c0-61ad-42ae-9b39-102d07a78ea7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap84b587ef-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485013, 'tstamp': 485013}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217713, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap84b587ef-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485016, 'tstamp': 485016}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217713, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.635 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84b587ef-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.637 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.640 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.640 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84b587ef-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.640 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.641 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84b587ef-d0, col_values=(('external_ids', {'iface-id': '5301ae8b-3d15-4378-9855-275f31f571b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:30 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:30.641 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.715 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.719 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.751 187643 INFO nova.virt.libvirt.driver [-] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Instance destroyed successfully.
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.751 187643 DEBUG nova.objects.instance [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lazy-loading 'resources' on Instance uuid b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.765 187643 DEBUG nova.virt.libvirt.vif [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-23T11:21:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1201356693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1201356693',id=28,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:21:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f82b70ddbd84b29baad3bb3a8bc340d',ramdisk_id='',reservation_id='r-5ksxs7l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:22:24Z,user_data=None,user_id='1cc69df7c7464f81ae1446f3587ebd7e',uuid=b71bc235-14d4-46a5-8f6d-bd5dc25af5a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "address": "fa:16:3e:f1:cd:77", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d1fd389-ac", "ovs_interfaceid": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.765 187643 DEBUG nova.network.os_vif_util [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converting VIF {"id": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "address": "fa:16:3e:f1:cd:77", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d1fd389-ac", "ovs_interfaceid": "1d1fd389-ac11-43ba-9bb5-45c9794c71bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.766 187643 DEBUG nova.network.os_vif_util [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f1:cd:77,bridge_name='br-int',has_traffic_filtering=True,id=1d1fd389-ac11-43ba-9bb5-45c9794c71bc,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d1fd389-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.766 187643 DEBUG os_vif [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:cd:77,bridge_name='br-int',has_traffic_filtering=True,id=1d1fd389-ac11-43ba-9bb5-45c9794c71bc,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d1fd389-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.767 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.768 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d1fd389-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.769 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.771 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.773 187643 INFO os_vif [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:cd:77,bridge_name='br-int',has_traffic_filtering=True,id=1d1fd389-ac11-43ba-9bb5-45c9794c71bc,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d1fd389-ac')
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.774 187643 INFO nova.virt.libvirt.driver [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Deleting instance files /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2_del
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.774 187643 INFO nova.virt.libvirt.driver [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Deletion of /var/lib/nova/instances/b71bc235-14d4-46a5-8f6d-bd5dc25af5a2_del complete
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.782 187643 DEBUG nova.compute.manager [req-02e65e68-05bc-4a5b-a3ae-421139173b0c req-ef007669-c090-4cee-9ae2-18614f227498 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Received event network-vif-unplugged-1d1fd389-ac11-43ba-9bb5-45c9794c71bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.783 187643 DEBUG oslo_concurrency.lockutils [req-02e65e68-05bc-4a5b-a3ae-421139173b0c req-ef007669-c090-4cee-9ae2-18614f227498 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.783 187643 DEBUG oslo_concurrency.lockutils [req-02e65e68-05bc-4a5b-a3ae-421139173b0c req-ef007669-c090-4cee-9ae2-18614f227498 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.783 187643 DEBUG oslo_concurrency.lockutils [req-02e65e68-05bc-4a5b-a3ae-421139173b0c req-ef007669-c090-4cee-9ae2-18614f227498 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.783 187643 DEBUG nova.compute.manager [req-02e65e68-05bc-4a5b-a3ae-421139173b0c req-ef007669-c090-4cee-9ae2-18614f227498 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] No waiting events found dispatching network-vif-unplugged-1d1fd389-ac11-43ba-9bb5-45c9794c71bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.783 187643 DEBUG nova.compute.manager [req-02e65e68-05bc-4a5b-a3ae-421139173b0c req-ef007669-c090-4cee-9ae2-18614f227498 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Received event network-vif-unplugged-1d1fd389-ac11-43ba-9bb5-45c9794c71bc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.818 187643 INFO nova.compute.manager [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Took 0.32 seconds to destroy the instance on the hypervisor.
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.818 187643 DEBUG oslo.service.loopingcall [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.819 187643 DEBUG nova.compute.manager [-] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:22:30 compute-0 nova_compute[187639]: 2026-02-23 11:22:30.819 187643 DEBUG nova.network.neutron [-] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:22:31 compute-0 sshd-session[217731]: Invalid user user from 143.198.30.3 port 56064
Feb 23 11:22:31 compute-0 sshd-session[217731]: Connection closed by invalid user user 143.198.30.3 port 56064 [preauth]
Feb 23 11:22:31 compute-0 openstack_network_exporter[199919]: ERROR   11:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:22:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:22:31 compute-0 openstack_network_exporter[199919]: ERROR   11:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:22:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.433 187643 DEBUG nova.network.neutron [-] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.447 187643 INFO nova.compute.manager [-] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Took 0.63 seconds to deallocate network for instance.
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.508 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.509 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.515 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.543 187643 DEBUG nova.compute.manager [req-7bc220a1-ac95-4516-9a98-8b2bd13aceb1 req-b60cbd4e-583f-447d-bde3-538419fb2ac8 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Received event network-vif-deleted-1d1fd389-ac11-43ba-9bb5-45c9794c71bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.550 187643 INFO nova.scheduler.client.report [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Deleted allocations for instance b71bc235-14d4-46a5-8f6d-bd5dc25af5a2
Feb 23 11:22:31 compute-0 nova_compute[187639]: 2026-02-23 11:22:31.609 187643 DEBUG oslo_concurrency.lockutils [None req-7a0cd07c-bc17-4e04-9edc-f131d3ee1936 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.454 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.455 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.455 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.455 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.456 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.457 187643 INFO nova.compute.manager [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Terminating instance
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.459 187643 DEBUG nova.compute.manager [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 23 11:22:32 compute-0 kernel: tap36da1c62-10 (unregistering): left promiscuous mode
Feb 23 11:22:32 compute-0 NetworkManager[57207]: <info>  [1771845752.4876] device (tap36da1c62-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:22:32 compute-0 ovn_controller[97601]: 2026-02-23T11:22:32Z|00215|binding|INFO|Releasing lport 36da1c62-1081-4041-ac32-4925d6a4ecb8 from this chassis (sb_readonly=0)
Feb 23 11:22:32 compute-0 ovn_controller[97601]: 2026-02-23T11:22:32Z|00216|binding|INFO|Setting lport 36da1c62-1081-4041-ac32-4925d6a4ecb8 down in Southbound
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.489 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:32 compute-0 ovn_controller[97601]: 2026-02-23T11:22:32Z|00217|binding|INFO|Removing iface tap36da1c62-10 ovn-installed in OVS
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.498 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:57:0e 10.100.0.8'], port_security=['fa:16:3e:45:57:0e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e192505-df0b-49ed-8cf3-a77e144e8ab3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84b587ef-d196-4e10-83df-6c7772bec83e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f82b70ddbd84b29baad3bb3a8bc340d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'eeda88b2-1534-4932-a80b-26165748fdb9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd2c591e-d0ea-472e-82c2-94f32e196de6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=36da1c62-1081-4041-ac32-4925d6a4ecb8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.500 106968 INFO neutron.agent.ovn.metadata.agent [-] Port 36da1c62-1081-4041-ac32-4925d6a4ecb8 in datapath 84b587ef-d196-4e10-83df-6c7772bec83e unbound from our chassis
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.503 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 84b587ef-d196-4e10-83df-6c7772bec83e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.503 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.504 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1548add5-f8e9-4ead-add5-da57751bea90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.504 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e namespace which is not needed anymore
Feb 23 11:22:32 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Feb 23 11:22:32 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001b.scope: Consumed 12.933s CPU time.
Feb 23 11:22:32 compute-0 systemd-machined[156970]: Machine qemu-19-instance-0000001b terminated.
Feb 23 11:22:32 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [NOTICE]   (217424) : haproxy version is 2.8.14-c23fe91
Feb 23 11:22:32 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [NOTICE]   (217424) : path to executable is /usr/sbin/haproxy
Feb 23 11:22:32 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [WARNING]  (217424) : Exiting Master process...
Feb 23 11:22:32 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [ALERT]    (217424) : Current worker (217426) exited with code 143 (Terminated)
Feb 23 11:22:32 compute-0 neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e[217420]: [WARNING]  (217424) : All workers exited. Exiting... (0)
Feb 23 11:22:32 compute-0 systemd[1]: libpod-d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e.scope: Deactivated successfully.
Feb 23 11:22:32 compute-0 podman[217753]: 2026-02-23 11:22:32.629206175 +0000 UTC m=+0.043626663 container died d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 11:22:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e-userdata-shm.mount: Deactivated successfully.
Feb 23 11:22:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-f1f59920ff2f33c48d29b3fbe1aa4c66d65909698e3e68bf32d422d5533c17ba-merged.mount: Deactivated successfully.
Feb 23 11:22:32 compute-0 podman[217753]: 2026-02-23 11:22:32.664733082 +0000 UTC m=+0.079153560 container cleanup d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:22:32 compute-0 systemd[1]: libpod-conmon-d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e.scope: Deactivated successfully.
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.708 187643 INFO nova.virt.libvirt.driver [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Instance destroyed successfully.
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.710 187643 DEBUG nova.objects.instance [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lazy-loading 'resources' on Instance uuid 1e192505-df0b-49ed-8cf3-a77e144e8ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.725 187643 DEBUG nova.virt.libvirt.vif [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:21:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1004673487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1004673487',id=27,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:21:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f82b70ddbd84b29baad3bb3a8bc340d',ramdisk_id='',reservation_id='r-0zu0clyc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image
_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-63810276-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:21:46Z,user_data=None,user_id='1cc69df7c7464f81ae1446f3587ebd7e',uuid=1e192505-df0b-49ed-8cf3-a77e144e8ab3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.726 187643 DEBUG nova.network.os_vif_util [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converting VIF {"id": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "address": "fa:16:3e:45:57:0e", "network": {"id": "84b587ef-d196-4e10-83df-6c7772bec83e", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1900665839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f82b70ddbd84b29baad3bb3a8bc340d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36da1c62-10", "ovs_interfaceid": "36da1c62-1081-4041-ac32-4925d6a4ecb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.726 187643 DEBUG nova.network.os_vif_util [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.726 187643 DEBUG os_vif [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.727 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.728 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36da1c62-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.729 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.732 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.733 187643 INFO os_vif [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:57:0e,bridge_name='br-int',has_traffic_filtering=True,id=36da1c62-1081-4041-ac32-4925d6a4ecb8,network=Network(84b587ef-d196-4e10-83df-6c7772bec83e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36da1c62-10')
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.733 187643 INFO nova.virt.libvirt.driver [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Deleting instance files /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3_del
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.734 187643 INFO nova.virt.libvirt.driver [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Deletion of /var/lib/nova/instances/1e192505-df0b-49ed-8cf3-a77e144e8ab3_del complete
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.786 187643 INFO nova.compute.manager [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.787 187643 DEBUG oslo.service.loopingcall [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.787 187643 DEBUG nova.compute.manager [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.787 187643 DEBUG nova.network.neutron [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 23 11:22:32 compute-0 podman[217787]: 2026-02-23 11:22:32.797073915 +0000 UTC m=+0.115259833 container remove d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.802 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf83540-28fe-466a-80b8-2f740014b6bb]: (4, ('Mon Feb 23 11:22:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e (d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e)\nd9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e\nMon Feb 23 11:22:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e (d9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e)\nd9f8f91a4b86b7430d2388897e4f05f2fa94f5fdd707f919537b45edfaa4434e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.804 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[82f2a379-ac29-4303-bbee-bdb04c48d403]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.806 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84b587ef-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.808 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:32 compute-0 kernel: tap84b587ef-d0: left promiscuous mode
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.813 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.817 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3745a9c5-54a0-4b4f-bb56-38ba4ebc6b1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.835 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0575d0-4997-452b-8f42-5a45fb8c8625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.836 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[411c0865-b26d-49fc-8053-bff64ae86aab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.853 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f434bc-20eb-4a7a-b8a3-b150f7529298]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484999, 'reachable_time': 41010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217819, 'error': None, 'target': 'ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.857 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-84b587ef-d196-4e10-83df-6c7772bec83e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:22:32 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:22:32.857 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[c69f5144-715b-4e52-97af-470b9ecdde8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:22:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d84b587ef\x2dd196\x2d4e10\x2d83df\x2d6c7772bec83e.mount: Deactivated successfully.
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.906 187643 DEBUG nova.compute.manager [req-430a45bf-0afe-41eb-847b-631692850e6d req-3e8b0f14-8294-4289-8673-e438a9d9106f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Received event network-vif-plugged-1d1fd389-ac11-43ba-9bb5-45c9794c71bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.907 187643 DEBUG oslo_concurrency.lockutils [req-430a45bf-0afe-41eb-847b-631692850e6d req-3e8b0f14-8294-4289-8673-e438a9d9106f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.907 187643 DEBUG oslo_concurrency.lockutils [req-430a45bf-0afe-41eb-847b-631692850e6d req-3e8b0f14-8294-4289-8673-e438a9d9106f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.907 187643 DEBUG oslo_concurrency.lockutils [req-430a45bf-0afe-41eb-847b-631692850e6d req-3e8b0f14-8294-4289-8673-e438a9d9106f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "b71bc235-14d4-46a5-8f6d-bd5dc25af5a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.908 187643 DEBUG nova.compute.manager [req-430a45bf-0afe-41eb-847b-631692850e6d req-3e8b0f14-8294-4289-8673-e438a9d9106f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] No waiting events found dispatching network-vif-plugged-1d1fd389-ac11-43ba-9bb5-45c9794c71bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:22:32 compute-0 nova_compute[187639]: 2026-02-23 11:22:32.908 187643 WARNING nova.compute.manager [req-430a45bf-0afe-41eb-847b-631692850e6d req-3e8b0f14-8294-4289-8673-e438a9d9106f 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Received unexpected event network-vif-plugged-1d1fd389-ac11-43ba-9bb5-45c9794c71bc for instance with vm_state deleted and task_state None.
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.505 187643 DEBUG nova.network.neutron [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.520 187643 INFO nova.compute.manager [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Took 0.73 seconds to deallocate network for instance.
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.581 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.582 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.635 187643 DEBUG nova.compute.provider_tree [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.655 187643 DEBUG nova.scheduler.client.report [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.676 187643 DEBUG nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received event network-vif-unplugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.677 187643 DEBUG oslo_concurrency.lockutils [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.677 187643 DEBUG oslo_concurrency.lockutils [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.677 187643 DEBUG oslo_concurrency.lockutils [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.677 187643 DEBUG nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] No waiting events found dispatching network-vif-unplugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.678 187643 WARNING nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received unexpected event network-vif-unplugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 for instance with vm_state deleted and task_state None.
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.678 187643 DEBUG nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.678 187643 DEBUG oslo_concurrency.lockutils [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.678 187643 DEBUG oslo_concurrency.lockutils [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.679 187643 DEBUG oslo_concurrency.lockutils [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.679 187643 DEBUG nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] No waiting events found dispatching network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.679 187643 WARNING nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received unexpected event network-vif-plugged-36da1c62-1081-4041-ac32-4925d6a4ecb8 for instance with vm_state deleted and task_state None.
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.680 187643 DEBUG nova.compute.manager [req-b8e94132-38fc-4813-8174-34fd0fc8609f req-7481a2b3-b29c-464c-950e-36489f7ef8e4 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Received event network-vif-deleted-36da1c62-1081-4041-ac32-4925d6a4ecb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.682 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.712 187643 INFO nova.scheduler.client.report [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Deleted allocations for instance 1e192505-df0b-49ed-8cf3-a77e144e8ab3
Feb 23 11:22:33 compute-0 nova_compute[187639]: 2026-02-23 11:22:33.780 187643 DEBUG oslo_concurrency.lockutils [None req-09482037-437a-4b74-b8dc-1333995e7f21 1cc69df7c7464f81ae1446f3587ebd7e 9f82b70ddbd84b29baad3bb3a8bc340d - - default default] Lock "1e192505-df0b-49ed-8cf3-a77e144e8ab3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:22:34 compute-0 nova_compute[187639]: 2026-02-23 11:22:34.997 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:35 compute-0 nova_compute[187639]: 2026-02-23 11:22:35.316 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:22:37 compute-0 nova_compute[187639]: 2026-02-23 11:22:37.730 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:37 compute-0 podman[217820]: 2026-02-23 11:22:37.843767504 +0000 UTC m=+0.050350100 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:22:40 compute-0 nova_compute[187639]: 2026-02-23 11:22:39.999 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:42 compute-0 nova_compute[187639]: 2026-02-23 11:22:42.732 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:43 compute-0 podman[217845]: 2026-02-23 11:22:43.850702296 +0000 UTC m=+0.055246239 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 11:22:45 compute-0 nova_compute[187639]: 2026-02-23 11:22:45.040 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:45 compute-0 nova_compute[187639]: 2026-02-23 11:22:45.750 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845750.7488778, b71bc235-14d4-46a5-8f6d-bd5dc25af5a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:22:45 compute-0 nova_compute[187639]: 2026-02-23 11:22:45.751 187643 INFO nova.compute.manager [-] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] VM Stopped (Lifecycle Event)
Feb 23 11:22:45 compute-0 nova_compute[187639]: 2026-02-23 11:22:45.776 187643 DEBUG nova.compute.manager [None req-148d3fc2-218f-42ea-85be-3e54d1069766 - - - - - -] [instance: b71bc235-14d4-46a5-8f6d-bd5dc25af5a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:22:47 compute-0 nova_compute[187639]: 2026-02-23 11:22:47.709 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845752.7072353, 1e192505-df0b-49ed-8cf3-a77e144e8ab3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:22:47 compute-0 nova_compute[187639]: 2026-02-23 11:22:47.709 187643 INFO nova.compute.manager [-] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] VM Stopped (Lifecycle Event)
Feb 23 11:22:47 compute-0 nova_compute[187639]: 2026-02-23 11:22:47.733 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:47 compute-0 nova_compute[187639]: 2026-02-23 11:22:47.737 187643 DEBUG nova.compute.manager [None req-ce86ce80-aa0d-4770-b94a-043cfa7854b7 - - - - - -] [instance: 1e192505-df0b-49ed-8cf3-a77e144e8ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:22:50 compute-0 nova_compute[187639]: 2026-02-23 11:22:50.094 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:52 compute-0 nova_compute[187639]: 2026-02-23 11:22:52.735 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:53 compute-0 sshd-session[217866]: Invalid user admin from 165.227.79.48 port 53022
Feb 23 11:22:53 compute-0 sshd-session[217866]: Connection closed by invalid user admin 165.227.79.48 port 53022 [preauth]
Feb 23 11:22:54 compute-0 podman[217868]: 2026-02-23 11:22:54.888582561 +0000 UTC m=+0.086433013 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 11:22:55 compute-0 nova_compute[187639]: 2026-02-23 11:22:55.136 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:57 compute-0 nova_compute[187639]: 2026-02-23 11:22:57.736 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:22:57 compute-0 podman[217896]: 2026-02-23 11:22:57.836592122 +0000 UTC m=+0.044776673 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal)
Feb 23 11:22:59 compute-0 podman[197002]: time="2026-02-23T11:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:22:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:22:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 23 11:23:00 compute-0 nova_compute[187639]: 2026-02-23 11:23:00.141 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:01 compute-0 openstack_network_exporter[199919]: ERROR   11:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:23:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:23:01 compute-0 openstack_network_exporter[199919]: ERROR   11:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:23:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:23:02 compute-0 sshd-session[217918]: Invalid user user from 143.198.30.3 port 47034
Feb 23 11:23:02 compute-0 sshd-session[217918]: Connection closed by invalid user user 143.198.30.3 port 47034 [preauth]
Feb 23 11:23:02 compute-0 nova_compute[187639]: 2026-02-23 11:23:02.737 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:03 compute-0 ovn_controller[97601]: 2026-02-23T11:23:03Z|00218|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 23 11:23:03 compute-0 nova_compute[187639]: 2026-02-23 11:23:03.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:03 compute-0 nova_compute[187639]: 2026-02-23 11:23:03.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 11:23:03 compute-0 nova_compute[187639]: 2026-02-23 11:23:03.709 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 11:23:05 compute-0 nova_compute[187639]: 2026-02-23 11:23:05.140 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:07 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:07.256 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:23:07 compute-0 nova_compute[187639]: 2026-02-23 11:23:07.256 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:07 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:07.257 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:23:07 compute-0 nova_compute[187639]: 2026-02-23 11:23:07.739 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:08 compute-0 podman[217920]: 2026-02-23 11:23:08.8435291 +0000 UTC m=+0.047521525 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 11:23:09 compute-0 nova_compute[187639]: 2026-02-23 11:23:09.341 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:10 compute-0 nova_compute[187639]: 2026-02-23 11:23:10.173 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:12.668 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:12.668 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:12.668 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:12 compute-0 nova_compute[187639]: 2026-02-23 11:23:12.708 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:12 compute-0 nova_compute[187639]: 2026-02-23 11:23:12.740 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:14 compute-0 nova_compute[187639]: 2026-02-23 11:23:14.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:14 compute-0 podman[217945]: 2026-02-23 11:23:14.843430655 +0000 UTC m=+0.051079549 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 11:23:15 compute-0 nova_compute[187639]: 2026-02-23 11:23:15.219 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:15 compute-0 sshd-session[217638]: Invalid user admin from 154.58.233.195 port 28111
Feb 23 11:23:16 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:16.259 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:16 compute-0 sshd-session[217638]: Connection closed by invalid user admin 154.58.233.195 port 28111 [preauth]
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.736 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.736 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.736 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:23:17 compute-0 nova_compute[187639]: 2026-02-23 11:23:17.741 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:18 compute-0 nova_compute[187639]: 2026-02-23 11:23:18.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:19 compute-0 nova_compute[187639]: 2026-02-23 11:23:19.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:20 compute-0 nova_compute[187639]: 2026-02-23 11:23:20.252 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:21 compute-0 nova_compute[187639]: 2026-02-23 11:23:21.693 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:22 compute-0 nova_compute[187639]: 2026-02-23 11:23:22.700 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:22 compute-0 nova_compute[187639]: 2026-02-23 11:23:22.798 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.292 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.699 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.731 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.732 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.733 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.733 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:23:25 compute-0 podman[217964]: 2026-02-23 11:23:25.901398289 +0000 UTC m=+0.129437387 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.911 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.913 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5849MB free_disk=73.20450592041016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.914 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.914 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.977 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:23:25 compute-0 nova_compute[187639]: 2026-02-23 11:23:25.977 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:23:26 compute-0 nova_compute[187639]: 2026-02-23 11:23:26.152 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:23:26 compute-0 nova_compute[187639]: 2026-02-23 11:23:26.165 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:23:26 compute-0 nova_compute[187639]: 2026-02-23 11:23:26.184 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:23:26 compute-0 nova_compute[187639]: 2026-02-23 11:23:26.184 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:27 compute-0 nova_compute[187639]: 2026-02-23 11:23:27.803 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:28 compute-0 podman[217991]: 2026-02-23 11:23:28.852357019 +0000 UTC m=+0.056354349 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1770267347, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z)
Feb 23 11:23:29 compute-0 podman[197002]: time="2026-02-23T11:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:23:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:23:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 23 11:23:30 compute-0 nova_compute[187639]: 2026-02-23 11:23:30.343 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:31 compute-0 openstack_network_exporter[199919]: ERROR   11:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:23:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:23:31 compute-0 openstack_network_exporter[199919]: ERROR   11:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:23:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:23:32 compute-0 nova_compute[187639]: 2026-02-23 11:23:32.805 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:32 compute-0 sshd-session[218012]: Invalid user user from 143.198.30.3 port 54492
Feb 23 11:23:32 compute-0 sshd-session[218012]: Connection closed by invalid user user 143.198.30.3 port 54492 [preauth]
Feb 23 11:23:35 compute-0 nova_compute[187639]: 2026-02-23 11:23:35.406 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:36 compute-0 sshd-session[218014]: Invalid user admin from 165.227.79.48 port 57042
Feb 23 11:23:36 compute-0 sshd-session[218014]: Connection closed by invalid user admin 165.227.79.48 port 57042 [preauth]
Feb 23 11:23:37 compute-0 nova_compute[187639]: 2026-02-23 11:23:37.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:37 compute-0 nova_compute[187639]: 2026-02-23 11:23:37.866 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:39 compute-0 podman[218016]: 2026-02-23 11:23:39.834528053 +0000 UTC m=+0.041824935 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:23:40 compute-0 nova_compute[187639]: 2026-02-23 11:23:40.447 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:42 compute-0 nova_compute[187639]: 2026-02-23 11:23:42.709 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:23:42 compute-0 nova_compute[187639]: 2026-02-23 11:23:42.709 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 11:23:42 compute-0 nova_compute[187639]: 2026-02-23 11:23:42.924 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.341 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.342 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.360 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.447 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.448 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.456 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.457 187643 INFO nova.compute.claims [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Claim successful on node compute-0.ctlplane.example.com
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.567 187643 DEBUG nova.compute.provider_tree [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.585 187643 DEBUG nova.scheduler.client.report [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.603 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.603 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.658 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.658 187643 DEBUG nova.network.neutron [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.681 187643 INFO nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.700 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.811 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.812 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.813 187643 INFO nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Creating image(s)
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.813 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "/var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.814 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "/var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.814 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "/var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.826 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.858 187643 DEBUG nova.policy [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c83e20651b24489da73abd4b530fb971', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12cbe83eae144acf8af44a253e12c69e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.898 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.899 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.900 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.910 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.956 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.957 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.981 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29,backing_fmt=raw /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.981 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:43 compute-0 nova_compute[187639]: 2026-02-23 11:23:43.982 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.033 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e36ab0395c52fe7fa1c5689cf1bfacc1926b9c29 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.034 187643 DEBUG nova.virt.disk.api [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Checking if we can resize image /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.034 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.092 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.093 187643 DEBUG nova.virt.disk.api [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Cannot resize image /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.093 187643 DEBUG nova.objects.instance [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lazy-loading 'migration_context' on Instance uuid 194d7361-7d98-4642-8052-446791060f8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.108 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.108 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Ensure instance console log exists: /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.109 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.109 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.109 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:44 compute-0 nova_compute[187639]: 2026-02-23 11:23:44.414 187643 DEBUG nova.network.neutron [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Successfully created port: b070b4ba-b956-47f5-be40-6adbf98ac276 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.331 187643 DEBUG nova.network.neutron [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Successfully updated port: b070b4ba-b956-47f5-be40-6adbf98ac276 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.349 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.350 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquired lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.350 187643 DEBUG nova.network.neutron [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.463 187643 DEBUG nova.compute.manager [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-changed-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.463 187643 DEBUG nova.compute.manager [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Refreshing instance network info cache due to event network-changed-b070b4ba-b956-47f5-be40-6adbf98ac276. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.464 187643 DEBUG oslo_concurrency.lockutils [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.489 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:45 compute-0 nova_compute[187639]: 2026-02-23 11:23:45.510 187643 DEBUG nova.network.neutron [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 23 11:23:45 compute-0 podman[218055]: 2026-02-23 11:23:45.838580136 +0000 UTC m=+0.047663369 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 11:23:45 compute-0 ovn_controller[97601]: 2026-02-23T11:23:45Z|00219|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.286 187643 DEBUG nova.network.neutron [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updating instance_info_cache with network_info: [{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.312 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Releasing lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.312 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Instance network_info: |[{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.312 187643 DEBUG oslo_concurrency.lockutils [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.313 187643 DEBUG nova.network.neutron [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Refreshing network info cache for port b070b4ba-b956-47f5-be40-6adbf98ac276 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.316 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Start _get_guest_xml network_info=[{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'image_id': '0ef805b1-b4a6-4839-ade3-d18a6c4b570e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.320 187643 WARNING nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.327 187643 DEBUG nova.virt.libvirt.host [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.327 187643 DEBUG nova.virt.libvirt.host [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.330 187643 DEBUG nova.virt.libvirt.host [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.331 187643 DEBUG nova.virt.libvirt.host [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.332 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.333 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T10:51:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8897af22-0b9b-4aa0-b3ec-ef8f2b8d73f5',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-23T10:51:56Z,direct_url=<?>,disk_format='qcow2',id=0ef805b1-b4a6-4839-ade3-d18a6c4b570e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c68d4cfcc5c84067ae596f11bca9a9e4',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-23T10:51:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.333 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.333 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.334 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.334 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.334 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.335 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.335 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.336 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.336 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.336 187643 DEBUG nova.virt.hardware [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.341 187643 DEBUG nova.virt.libvirt.vif [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:23:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-426222971',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-426222971',id=29,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12cbe83eae144acf8af44a253e12c69e',ramdisk_id='',reservation_id='r-ynmpvtiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-777871079',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-7778
71079-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:23:43Z,user_data=None,user_id='c83e20651b24489da73abd4b530fb971',uuid=194d7361-7d98-4642-8052-446791060f8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.341 187643 DEBUG nova.network.os_vif_util [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Converting VIF {"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.342 187643 DEBUG nova.network.os_vif_util [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.343 187643 DEBUG nova.objects.instance [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lazy-loading 'pci_devices' on Instance uuid 194d7361-7d98-4642-8052-446791060f8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.359 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] End _get_guest_xml xml=<domain type="kvm">
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <uuid>194d7361-7d98-4642-8052-446791060f8a</uuid>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <name>instance-0000001d</name>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <memory>131072</memory>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <vcpu>1</vcpu>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <metadata>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-426222971</nova:name>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:creationTime>2026-02-23 11:23:46</nova:creationTime>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:flavor name="m1.nano">
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:memory>128</nova:memory>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:disk>1</nova:disk>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:swap>0</nova:swap>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:ephemeral>0</nova:ephemeral>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:vcpus>1</nova:vcpus>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       </nova:flavor>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:owner>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:user uuid="c83e20651b24489da73abd4b530fb971">tempest-TestExecuteZoneMigrationStrategy-777871079-project-member</nova:user>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:project uuid="12cbe83eae144acf8af44a253e12c69e">tempest-TestExecuteZoneMigrationStrategy-777871079</nova:project>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       </nova:owner>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:root type="image" uuid="0ef805b1-b4a6-4839-ade3-d18a6c4b570e"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <nova:ports>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         <nova:port uuid="b070b4ba-b956-47f5-be40-6adbf98ac276">
Feb 23 11:23:46 compute-0 nova_compute[187639]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:         </nova:port>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       </nova:ports>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </nova:instance>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </metadata>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <sysinfo type="smbios">
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <system>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <entry name="manufacturer">RDO</entry>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <entry name="product">OpenStack Compute</entry>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <entry name="serial">194d7361-7d98-4642-8052-446791060f8a</entry>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <entry name="uuid">194d7361-7d98-4642-8052-446791060f8a</entry>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <entry name="family">Virtual Machine</entry>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </system>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </sysinfo>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <os>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <boot dev="hd"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <smbios mode="sysinfo"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </os>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <features>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <acpi/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <apic/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <vmcoreinfo/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </features>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <clock offset="utc">
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <timer name="hpet" present="no"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </clock>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <cpu mode="custom" match="exact">
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <model>Nehalem</model>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </cpu>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   <devices>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <disk type="file" device="disk">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <target dev="vda" bus="virtio"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <disk type="file" device="cdrom">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <driver name="qemu" type="raw" cache="none"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <source file="/var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.config"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <target dev="sda" bus="sata"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </disk>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <interface type="ethernet">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <mac address="fa:16:3e:ee:cf:53"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <mtu size="1442"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <target dev="tapb070b4ba-b9"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </interface>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <serial type="pty">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <log file="/var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/console.log" append="off"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </serial>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <video>
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <model type="virtio"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </video>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <input type="tablet" bus="usb"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <rng model="virtio">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <backend model="random">/dev/urandom</backend>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </rng>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <controller type="usb" index="0"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     <memballoon model="virtio">
Feb 23 11:23:46 compute-0 nova_compute[187639]:       <stats period="10"/>
Feb 23 11:23:46 compute-0 nova_compute[187639]:     </memballoon>
Feb 23 11:23:46 compute-0 nova_compute[187639]:   </devices>
Feb 23 11:23:46 compute-0 nova_compute[187639]: </domain>
Feb 23 11:23:46 compute-0 nova_compute[187639]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.360 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Preparing to wait for external event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.360 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.360 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.360 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.361 187643 DEBUG nova.virt.libvirt.vif [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T11:23:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-426222971',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-426222971',id=29,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12cbe83eae144acf8af44a253e12c69e',ramdisk_id='',reservation_id='r-ynmpvtiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-777871079',owner_user_name='tempest-TestExecuteZoneMigrationStr
ategy-777871079-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-23T11:23:43Z,user_data=None,user_id='c83e20651b24489da73abd4b530fb971',uuid=194d7361-7d98-4642-8052-446791060f8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.361 187643 DEBUG nova.network.os_vif_util [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Converting VIF {"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.362 187643 DEBUG nova.network.os_vif_util [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.362 187643 DEBUG os_vif [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.363 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.363 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.364 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.366 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.367 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb070b4ba-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.367 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb070b4ba-b9, col_values=(('external_ids', {'iface-id': 'b070b4ba-b956-47f5-be40-6adbf98ac276', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:cf:53', 'vm-uuid': '194d7361-7d98-4642-8052-446791060f8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:46 compute-0 NetworkManager[57207]: <info>  [1771845826.3701] manager: (tapb070b4ba-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.370 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.373 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.374 187643 INFO os_vif [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9')
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.426 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.427 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.427 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] No VIF found with MAC fa:16:3e:ee:cf:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 23 11:23:46 compute-0 nova_compute[187639]: 2026-02-23 11:23:46.428 187643 INFO nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Using config drive
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.768 187643 INFO nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Creating config drive at /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.config
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.772 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpo47uqwq7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.888 187643 DEBUG oslo_concurrency.processutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpo47uqwq7" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:23:48 compute-0 kernel: tapb070b4ba-b9: entered promiscuous mode
Feb 23 11:23:48 compute-0 NetworkManager[57207]: <info>  [1771845828.9358] manager: (tapb070b4ba-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.938 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:48 compute-0 ovn_controller[97601]: 2026-02-23T11:23:48Z|00220|binding|INFO|Claiming lport b070b4ba-b956-47f5-be40-6adbf98ac276 for this chassis.
Feb 23 11:23:48 compute-0 ovn_controller[97601]: 2026-02-23T11:23:48Z|00221|binding|INFO|b070b4ba-b956-47f5-be40-6adbf98ac276: Claiming fa:16:3e:ee:cf:53 10.100.0.10
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.939 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.942 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.944 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.955 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:cf:53 10.100.0.10'], port_security=['fa:16:3e:ee:cf:53 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '194d7361-7d98-4642-8052-446791060f8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44df174f-c509-4e62-8ffe-50240bf4471c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12cbe83eae144acf8af44a253e12c69e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '273497ec-a061-489f-a448-7e8beae43d89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4c66e88-7a15-44e6-bbcb-ddfcf149dd37, chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b070b4ba-b956-47f5-be40-6adbf98ac276) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.956 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b070b4ba-b956-47f5-be40-6adbf98ac276 in datapath 44df174f-c509-4e62-8ffe-50240bf4471c bound to our chassis
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.958 106968 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44df174f-c509-4e62-8ffe-50240bf4471c
Feb 23 11:23:48 compute-0 ovn_controller[97601]: 2026-02-23T11:23:48Z|00222|binding|INFO|Setting lport b070b4ba-b956-47f5-be40-6adbf98ac276 ovn-installed in OVS
Feb 23 11:23:48 compute-0 ovn_controller[97601]: 2026-02-23T11:23:48Z|00223|binding|INFO|Setting lport b070b4ba-b956-47f5-be40-6adbf98ac276 up in Southbound
Feb 23 11:23:48 compute-0 systemd-udevd[218095]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:23:48 compute-0 nova_compute[187639]: 2026-02-23 11:23:48.961 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:48 compute-0 systemd-machined[156970]: New machine qemu-21-instance-0000001d.
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.965 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8c7aadee-7096-479e-9d84-e60491a924f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.966 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44df174f-c1 in ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.968 208487 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44df174f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.968 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[885b533e-8600-4e6b-a34e-a9748043dfde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.969 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[8eac49d8-d814-4a91-ac0e-52bd068e9304]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:48 compute-0 NetworkManager[57207]: <info>  [1771845828.9714] device (tapb070b4ba-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 23 11:23:48 compute-0 NetworkManager[57207]: <info>  [1771845828.9718] device (tapb070b4ba-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.975 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[8136fb87-3d34-4cac-bfd4-ebe1c3331192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:48 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001d.
Feb 23 11:23:48 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:48.995 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4aa615-ff88-42c9-9ece-b7ffb4e080c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.016 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[d81ff937-e9c5-42be-948b-3de4a2a69747]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 systemd-udevd[218098]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 11:23:49 compute-0 NetworkManager[57207]: <info>  [1771845829.0208] manager: (tap44df174f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.020 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[515dd13c-103d-4e05-9ed7-0421327170b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.043 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[f728d2af-c7b1-4181-8135-f618dd1b18c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.047 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[db26d5b8-8540-4aeb-8e34-3f3104ba64b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 NetworkManager[57207]: <info>  [1771845829.0612] device (tap44df174f-c0): carrier: link connected
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.063 208501 DEBUG oslo.privsep.daemon [-] privsep: reply[49e7f042-deb1-467c-9650-e7f5458372b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.073 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[80a613e5-49e2-4432-a3b4-c67e18cd4700]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44df174f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:42:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497322, 'reachable_time': 42237, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218128, 'error': None, 'target': 'ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.080 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc7cbb1-55f7-4da6-940e-bf81ceab6ebb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:42a2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497322, 'tstamp': 497322}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218129, 'error': None, 'target': 'ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.090 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5752bf-277f-450f-94ce-f88972d48909]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44df174f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:42:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497322, 'reachable_time': 42237, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218130, 'error': None, 'target': 'ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.111 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[4299c8df-e9b9-477e-8402-ef8c1a5a8dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.146 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d2831c-bba9-4c67-84f5-b3b6c37ee679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.148 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44df174f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.148 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.148 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44df174f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:49 compute-0 NetworkManager[57207]: <info>  [1771845829.1512] manager: (tap44df174f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Feb 23 11:23:49 compute-0 kernel: tap44df174f-c0: entered promiscuous mode
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.152 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.154 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44df174f-c0, col_values=(('external_ids', {'iface-id': 'ef93db81-02ee-4949-881a-2ea5c8916496'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:23:49 compute-0 ovn_controller[97601]: 2026-02-23T11:23:49Z|00224|binding|INFO|Releasing lport ef93db81-02ee-4949-881a-2ea5c8916496 from this chassis (sb_readonly=0)
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.162 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.164 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.164 106968 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44df174f-c509-4e62-8ffe-50240bf4471c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44df174f-c509-4e62-8ffe-50240bf4471c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.165 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[66b958d0-62de-4bde-9d0d-d5caca17998e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.166 106968 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: global
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     log         /dev/log local0 debug
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     log-tag     haproxy-metadata-proxy-44df174f-c509-4e62-8ffe-50240bf4471c
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     user        root
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     group       root
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     maxconn     1024
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     pidfile     /var/lib/neutron/external/pids/44df174f-c509-4e62-8ffe-50240bf4471c.pid.haproxy
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     daemon
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: defaults
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     log global
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     mode http
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     option httplog
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     option dontlognull
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     option http-server-close
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     option forwardfor
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     retries                 3
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     timeout http-request    30s
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     timeout connect         30s
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     timeout client          32s
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     timeout server          32s
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     timeout http-keep-alive 30s
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: listen listener
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     bind 169.254.169.254:80
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:     http-request add-header X-OVN-Network-ID 44df174f-c509-4e62-8ffe-50240bf4471c
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 11:23:49 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:23:49.166 106968 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c', 'env', 'PROCESS_TAG=haproxy-44df174f-c509-4e62-8ffe-50240bf4471c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44df174f-c509-4e62-8ffe-50240bf4471c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 11:23:49 compute-0 podman[218162]: 2026-02-23 11:23:49.494173822 +0000 UTC m=+0.072548325 container create 01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 11:23:49 compute-0 systemd[1]: Started libpod-conmon-01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873.scope.
Feb 23 11:23:49 compute-0 podman[218162]: 2026-02-23 11:23:49.453080778 +0000 UTC m=+0.031455311 image pull 76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 11:23:49 compute-0 systemd[1]: Started libcrun container.
Feb 23 11:23:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92b24f56c3fa44a166db8a9438db716d6bc43a8c5239d94dac85254d8ffedbab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 11:23:49 compute-0 podman[218162]: 2026-02-23 11:23:49.564835407 +0000 UTC m=+0.143209920 container init 01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 11:23:49 compute-0 podman[218162]: 2026-02-23 11:23:49.569265874 +0000 UTC m=+0.147640377 container start 01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 11:23:49 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [NOTICE]   (218181) : New worker (218183) forked
Feb 23 11:23:49 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [NOTICE]   (218181) : Loading success.
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.908 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845829.9084053, 194d7361-7d98-4642-8052-446791060f8a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.909 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] VM Started (Lifecycle Event)
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.928 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.932 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845829.9118116, 194d7361-7d98-4642-8052-446791060f8a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.933 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] VM Paused (Lifecycle Event)
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.953 187643 DEBUG nova.compute.manager [req-11fb9f32-3f50-43ef-8830-72a44fca97bb req-95b3faae-275f-41fe-a0b4-b425e6f4820e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.953 187643 DEBUG oslo_concurrency.lockutils [req-11fb9f32-3f50-43ef-8830-72a44fca97bb req-95b3faae-275f-41fe-a0b4-b425e6f4820e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.954 187643 DEBUG oslo_concurrency.lockutils [req-11fb9f32-3f50-43ef-8830-72a44fca97bb req-95b3faae-275f-41fe-a0b4-b425e6f4820e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.954 187643 DEBUG oslo_concurrency.lockutils [req-11fb9f32-3f50-43ef-8830-72a44fca97bb req-95b3faae-275f-41fe-a0b4-b425e6f4820e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.954 187643 DEBUG nova.compute.manager [req-11fb9f32-3f50-43ef-8830-72a44fca97bb req-95b3faae-275f-41fe-a0b4-b425e6f4820e 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Processing event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.955 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.956 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.958 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.960 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845829.9576004, 194d7361-7d98-4642-8052-446791060f8a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.960 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] VM Resumed (Lifecycle Event)
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.963 187643 INFO nova.virt.libvirt.driver [-] [instance: 194d7361-7d98-4642-8052-446791060f8a] Instance spawned successfully.
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.963 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.984 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:23:49 compute-0 nova_compute[187639]: 2026-02-23 11:23:49.991 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.000 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.000 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.001 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.001 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.001 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.002 187643 DEBUG nova.virt.libvirt.driver [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.029 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.060 187643 INFO nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Took 6.25 seconds to spawn the instance on the hypervisor.
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.060 187643 DEBUG nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.109 187643 INFO nova.compute.manager [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Took 6.70 seconds to build instance.
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.141 187643 DEBUG oslo_concurrency.lockutils [None req-179deba1-e2bb-4ce0-aa0e-188549461533 c83e20651b24489da73abd4b530fb971 12cbe83eae144acf8af44a253e12c69e - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.489 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.769 187643 DEBUG nova.network.neutron [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updated VIF entry in instance network info cache for port b070b4ba-b956-47f5-be40-6adbf98ac276. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.770 187643 DEBUG nova.network.neutron [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updating instance_info_cache with network_info: [{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:23:50 compute-0 nova_compute[187639]: 2026-02-23 11:23:50.794 187643 DEBUG oslo_concurrency.lockutils [req-7eb6e68a-b6d9-4c36-809a-4f023047c442 req-81ff1f28-b285-4236-825f-1001f2e108a0 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:23:51 compute-0 nova_compute[187639]: 2026-02-23 11:23:51.369 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:52 compute-0 nova_compute[187639]: 2026-02-23 11:23:52.036 187643 DEBUG nova.compute.manager [req-4a380b41-b410-4f93-8ca0-8e471b4a7f47 req-b408d1e0-6972-4d60-94c0-f24dd43f0c7d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:23:52 compute-0 nova_compute[187639]: 2026-02-23 11:23:52.037 187643 DEBUG oslo_concurrency.lockutils [req-4a380b41-b410-4f93-8ca0-8e471b4a7f47 req-b408d1e0-6972-4d60-94c0-f24dd43f0c7d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:23:52 compute-0 nova_compute[187639]: 2026-02-23 11:23:52.037 187643 DEBUG oslo_concurrency.lockutils [req-4a380b41-b410-4f93-8ca0-8e471b4a7f47 req-b408d1e0-6972-4d60-94c0-f24dd43f0c7d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:23:52 compute-0 nova_compute[187639]: 2026-02-23 11:23:52.038 187643 DEBUG oslo_concurrency.lockutils [req-4a380b41-b410-4f93-8ca0-8e471b4a7f47 req-b408d1e0-6972-4d60-94c0-f24dd43f0c7d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:23:52 compute-0 nova_compute[187639]: 2026-02-23 11:23:52.038 187643 DEBUG nova.compute.manager [req-4a380b41-b410-4f93-8ca0-8e471b4a7f47 req-b408d1e0-6972-4d60-94c0-f24dd43f0c7d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:23:52 compute-0 nova_compute[187639]: 2026-02-23 11:23:52.038 187643 WARNING nova.compute.manager [req-4a380b41-b410-4f93-8ca0-8e471b4a7f47 req-b408d1e0-6972-4d60-94c0-f24dd43f0c7d 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received unexpected event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with vm_state active and task_state None.
Feb 23 11:23:55 compute-0 nova_compute[187639]: 2026-02-23 11:23:55.530 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:56 compute-0 nova_compute[187639]: 2026-02-23 11:23:56.376 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:23:56 compute-0 podman[218200]: 2026-02-23 11:23:56.868287395 +0000 UTC m=+0.068741785 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 11:23:59 compute-0 podman[197002]: time="2026-02-23T11:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:23:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 23 11:23:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 23 11:23:59 compute-0 podman[218227]: 2026-02-23 11:23:59.848115186 +0000 UTC m=+0.054881939 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container)
Feb 23 11:24:00 compute-0 nova_compute[187639]: 2026-02-23 11:24:00.534 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:01 compute-0 nova_compute[187639]: 2026-02-23 11:24:01.378 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:01 compute-0 openstack_network_exporter[199919]: ERROR   11:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:24:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:24:01 compute-0 openstack_network_exporter[199919]: ERROR   11:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:24:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:24:01 compute-0 ovn_controller[97601]: 2026-02-23T11:24:01Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:cf:53 10.100.0.10
Feb 23 11:24:01 compute-0 ovn_controller[97601]: 2026-02-23T11:24:01Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:cf:53 10.100.0.10
Feb 23 11:24:04 compute-0 sshd-session[218268]: Invalid user user from 143.198.30.3 port 56988
Feb 23 11:24:04 compute-0 sshd-session[218268]: Connection closed by invalid user user 143.198.30.3 port 56988 [preauth]
Feb 23 11:24:05 compute-0 nova_compute[187639]: 2026-02-23 11:24:05.537 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:06 compute-0 nova_compute[187639]: 2026-02-23 11:24:06.380 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:10 compute-0 nova_compute[187639]: 2026-02-23 11:24:10.582 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:10 compute-0 podman[218270]: 2026-02-23 11:24:10.846717384 +0000 UTC m=+0.051720516 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 11:24:11 compute-0 nova_compute[187639]: 2026-02-23 11:24:11.383 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:12.669 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:12.670 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:12.670 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:13 compute-0 nova_compute[187639]: 2026-02-23 11:24:13.710 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:14 compute-0 nova_compute[187639]: 2026-02-23 11:24:14.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:15 compute-0 nova_compute[187639]: 2026-02-23 11:24:15.584 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:16 compute-0 nova_compute[187639]: 2026-02-23 11:24:16.385 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:16 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 23 11:24:16 compute-0 podman[218296]: 2026-02-23 11:24:16.55942241 +0000 UTC m=+0.044875965 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.693 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.953 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.953 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquired lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.954 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 11:24:17 compute-0 nova_compute[187639]: 2026-02-23 11:24:17.954 187643 DEBUG nova.objects.instance [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 194d7361-7d98-4642-8052-446791060f8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:24:19 compute-0 ovn_controller[97601]: 2026-02-23T11:24:19Z|00225|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.081 187643 DEBUG nova.network.neutron [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updating instance_info_cache with network_info: [{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.099 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Releasing lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.099 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.100 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.100 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.100 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.100 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.267 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Check if temp file /var/lib/nova/instances/tmpfhjw17gn exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.268 187643 DEBUG nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfhjw17gn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='194d7361-7d98-4642-8052-446791060f8a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 23 11:24:20 compute-0 sshd-session[218318]: Invalid user admin from 165.227.79.48 port 58508
Feb 23 11:24:20 compute-0 sshd-session[218318]: Connection closed by invalid user admin 165.227.79.48 port 58508 [preauth]
Feb 23 11:24:20 compute-0 nova_compute[187639]: 2026-02-23 11:24:20.586 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:21 compute-0 nova_compute[187639]: 2026-02-23 11:24:21.143 187643 DEBUG oslo_concurrency.processutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:24:21 compute-0 nova_compute[187639]: 2026-02-23 11:24:21.192 187643 DEBUG oslo_concurrency.processutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:24:21 compute-0 nova_compute[187639]: 2026-02-23 11:24:21.193 187643 DEBUG oslo_concurrency.processutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:24:21 compute-0 nova_compute[187639]: 2026-02-23 11:24:21.270 187643 DEBUG oslo_concurrency.processutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:24:21 compute-0 nova_compute[187639]: 2026-02-23 11:24:21.388 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:23 compute-0 nova_compute[187639]: 2026-02-23 11:24:23.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:23 compute-0 sshd-session[218326]: Accepted publickey for nova from 192.168.122.101 port 38722 ssh2: ECDSA SHA256:7C3ZXVuOD26JEFnBX8ceFh1ceIeoysV++WtxsvNUNC4
Feb 23 11:24:23 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 23 11:24:23 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 23 11:24:23 compute-0 systemd-logind[808]: New session 45 of user nova.
Feb 23 11:24:23 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 23 11:24:23 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 23 11:24:23 compute-0 systemd[218330]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:24:23 compute-0 systemd[218330]: Queued start job for default target Main User Target.
Feb 23 11:24:23 compute-0 systemd[218330]: Created slice User Application Slice.
Feb 23 11:24:23 compute-0 systemd[218330]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:24:23 compute-0 systemd[218330]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 11:24:23 compute-0 systemd[218330]: Reached target Paths.
Feb 23 11:24:23 compute-0 systemd[218330]: Reached target Timers.
Feb 23 11:24:23 compute-0 systemd[218330]: Starting D-Bus User Message Bus Socket...
Feb 23 11:24:23 compute-0 systemd[218330]: Starting Create User's Volatile Files and Directories...
Feb 23 11:24:23 compute-0 systemd[218330]: Listening on D-Bus User Message Bus Socket.
Feb 23 11:24:23 compute-0 systemd[218330]: Reached target Sockets.
Feb 23 11:24:23 compute-0 systemd[218330]: Finished Create User's Volatile Files and Directories.
Feb 23 11:24:23 compute-0 systemd[218330]: Reached target Basic System.
Feb 23 11:24:23 compute-0 systemd[218330]: Reached target Main User Target.
Feb 23 11:24:23 compute-0 systemd[218330]: Startup finished in 126ms.
Feb 23 11:24:23 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 23 11:24:23 compute-0 systemd[1]: Started Session 45 of User nova.
Feb 23 11:24:23 compute-0 sshd-session[218326]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 23 11:24:23 compute-0 sshd-session[218345]: Received disconnect from 192.168.122.101 port 38722:11: disconnected by user
Feb 23 11:24:23 compute-0 sshd-session[218345]: Disconnected from user nova 192.168.122.101 port 38722
Feb 23 11:24:23 compute-0 sshd-session[218326]: pam_unix(sshd:session): session closed for user nova
Feb 23 11:24:23 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Feb 23 11:24:23 compute-0 systemd-logind[808]: Session 45 logged out. Waiting for processes to exit.
Feb 23 11:24:23 compute-0 systemd-logind[808]: Removed session 45.
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.572 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:24.622 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:24:24 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:24.623 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.793 187643 DEBUG nova.compute.manager [req-afb7d5ba-0b02-4cc5-8b7f-0e45e680ce42 req-63f586cf-f1e2-4e1a-ae64-36c3c6ae5e23 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-unplugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.793 187643 DEBUG oslo_concurrency.lockutils [req-afb7d5ba-0b02-4cc5-8b7f-0e45e680ce42 req-63f586cf-f1e2-4e1a-ae64-36c3c6ae5e23 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.793 187643 DEBUG oslo_concurrency.lockutils [req-afb7d5ba-0b02-4cc5-8b7f-0e45e680ce42 req-63f586cf-f1e2-4e1a-ae64-36c3c6ae5e23 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.793 187643 DEBUG oslo_concurrency.lockutils [req-afb7d5ba-0b02-4cc5-8b7f-0e45e680ce42 req-63f586cf-f1e2-4e1a-ae64-36c3c6ae5e23 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.794 187643 DEBUG nova.compute.manager [req-afb7d5ba-0b02-4cc5-8b7f-0e45e680ce42 req-63f586cf-f1e2-4e1a-ae64-36c3c6ae5e23 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-unplugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.794 187643 DEBUG nova.compute.manager [req-afb7d5ba-0b02-4cc5-8b7f-0e45e680ce42 req-63f586cf-f1e2-4e1a-ae64-36c3c6ae5e23 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-unplugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.860 187643 INFO nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Took 3.59 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.860 187643 DEBUG nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 11:24:24 compute-0 nova_compute[187639]: 2026-02-23 11:24:24.882 187643 DEBUG nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfhjw17gn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='194d7361-7d98-4642-8052-446791060f8a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(6405d905-5a72-40e4-a625-73383c777b00),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.035 187643 DEBUG nova.objects.instance [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lazy-loading 'migration_context' on Instance uuid 194d7361-7d98-4642-8052-446791060f8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.036 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.037 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.038 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.060 187643 DEBUG nova.virt.libvirt.vif [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:23:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-426222971',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-426222971',id=29,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:23:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='12cbe83eae144acf8af44a253e12c69e',ramdisk_id='',reservation_id='r-ynmpvtiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-777871079',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-777871079-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:23:50Z,user_data=None,user_id='c83e20651b24489da73abd4b530fb971',uuid=194d7361-7d98-4642-8052-446791060f8a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.061 187643 DEBUG nova.network.os_vif_util [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.062 187643 DEBUG nova.network.os_vif_util [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.064 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updating guest XML with vif config: <interface type="ethernet">
Feb 23 11:24:25 compute-0 nova_compute[187639]:   <mac address="fa:16:3e:ee:cf:53"/>
Feb 23 11:24:25 compute-0 nova_compute[187639]:   <model type="virtio"/>
Feb 23 11:24:25 compute-0 nova_compute[187639]:   <driver name="vhost" rx_queue_size="512"/>
Feb 23 11:24:25 compute-0 nova_compute[187639]:   <mtu size="1442"/>
Feb 23 11:24:25 compute-0 nova_compute[187639]:   <target dev="tapb070b4ba-b9"/>
Feb 23 11:24:25 compute-0 nova_compute[187639]: </interface>
Feb 23 11:24:25 compute-0 nova_compute[187639]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.065 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.540 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.541 187643 INFO nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 23 11:24:25 compute-0 nova_compute[187639]: 2026-02-23 11:24:25.593 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:25 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:25.624 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.384 187643 INFO nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.390 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.582 187643 DEBUG nova.compute.manager [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.583 187643 DEBUG oslo_concurrency.lockutils [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.583 187643 DEBUG oslo_concurrency.lockutils [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.583 187643 DEBUG oslo_concurrency.lockutils [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.584 187643 DEBUG nova.compute.manager [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.584 187643 WARNING nova.compute.manager [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received unexpected event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with vm_state active and task_state migrating.
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.585 187643 DEBUG nova.compute.manager [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-changed-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.585 187643 DEBUG nova.compute.manager [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Refreshing instance network info cache due to event network-changed-b070b4ba-b956-47f5-be40-6adbf98ac276. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.586 187643 DEBUG oslo_concurrency.lockutils [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.586 187643 DEBUG oslo_concurrency.lockutils [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquired lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.587 187643 DEBUG nova.network.neutron [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Refreshing network info cache for port b070b4ba-b956-47f5-be40-6adbf98ac276 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.718 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.719 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.719 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.831 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.886 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.887 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.912 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.912 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 11:24:26 compute-0 nova_compute[187639]: 2026-02-23 11:24:26.982 187643 DEBUG oslo_concurrency.processutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.132 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.133 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5666MB free_disk=73.17572784423828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.134 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.134 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.217 187643 INFO nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updating resource usage from migration 6405d905-5a72-40e4-a625-73383c777b00
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.300 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Migration 6405d905-5a72-40e4-a625-73383c777b00 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.301 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.301 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.385 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.390 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.390 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.403 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.450 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.451 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.867 187643 DEBUG nova.network.neutron [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updated VIF entry in instance network info cache for port b070b4ba-b956-47f5-be40-6adbf98ac276. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.868 187643 DEBUG nova.network.neutron [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Updating instance_info_cache with network_info: [{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.903 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 23 11:24:27 compute-0 nova_compute[187639]: 2026-02-23 11:24:27.903 187643 DEBUG nova.virt.libvirt.migration [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 23 11:24:27 compute-0 podman[218358]: 2026-02-23 11:24:27.946910038 +0000 UTC m=+0.145751287 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.020 187643 DEBUG oslo_concurrency.lockutils [req-4182a532-ce1c-419a-9480-6a7e827376a4 req-30603797-9825-4552-965c-ccca8c41ae7a 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Releasing lock "refresh_cache-194d7361-7d98-4642-8052-446791060f8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.177 187643 DEBUG nova.virt.driver [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] Emitting event <LifecycleEvent: 1771845868.1771574, 194d7361-7d98-4642-8052-446791060f8a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.178 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] VM Paused (Lifecycle Event)
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.193 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.196 187643 DEBUG nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.213 187643 INFO nova.compute.manager [None req-e8ba4618-6245-4173-a481-559539bdb71b - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 23 11:24:28 compute-0 kernel: tapb070b4ba-b9 (unregistering): left promiscuous mode
Feb 23 11:24:28 compute-0 NetworkManager[57207]: <info>  [1771845868.3077] device (tapb070b4ba-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 23 11:24:28 compute-0 ovn_controller[97601]: 2026-02-23T11:24:28Z|00226|binding|INFO|Releasing lport b070b4ba-b956-47f5-be40-6adbf98ac276 from this chassis (sb_readonly=0)
Feb 23 11:24:28 compute-0 ovn_controller[97601]: 2026-02-23T11:24:28Z|00227|binding|INFO|Setting lport b070b4ba-b956-47f5-be40-6adbf98ac276 down in Southbound
Feb 23 11:24:28 compute-0 ovn_controller[97601]: 2026-02-23T11:24:28Z|00228|binding|INFO|Removing iface tapb070b4ba-b9 ovn-installed in OVS
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.313 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.315 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.319 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:cf:53 10.100.0.10'], port_security=['fa:16:3e:ee:cf:53 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '48738a31-ba59-4fc8-acf1-d1f474e97648'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '194d7361-7d98-4642-8052-446791060f8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44df174f-c509-4e62-8ffe-50240bf4471c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12cbe83eae144acf8af44a253e12c69e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '273497ec-a061-489f-a448-7e8beae43d89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4c66e88-7a15-44e6-bbcb-ddfcf149dd37, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>], logical_port=b070b4ba-b956-47f5-be40-6adbf98ac276) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3c56d036d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.321 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.321 106968 INFO neutron.agent.ovn.metadata.agent [-] Port b070b4ba-b956-47f5-be40-6adbf98ac276 in datapath 44df174f-c509-4e62-8ffe-50240bf4471c unbound from our chassis
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.323 106968 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44df174f-c509-4e62-8ffe-50240bf4471c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.325 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9d58f9-e8ab-43f0-a44e-c941c5ef5d06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.326 106968 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c namespace which is not needed anymore
Feb 23 11:24:28 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Feb 23 11:24:28 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Consumed 13.911s CPU time.
Feb 23 11:24:28 compute-0 systemd-machined[156970]: Machine qemu-21-instance-0000001d terminated.
Feb 23 11:24:28 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [NOTICE]   (218181) : haproxy version is 2.8.14-c23fe91
Feb 23 11:24:28 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [NOTICE]   (218181) : path to executable is /usr/sbin/haproxy
Feb 23 11:24:28 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [WARNING]  (218181) : Exiting Master process...
Feb 23 11:24:28 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [ALERT]    (218181) : Current worker (218183) exited with code 143 (Terminated)
Feb 23 11:24:28 compute-0 neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c[218177]: [WARNING]  (218181) : All workers exited. Exiting... (0)
Feb 23 11:24:28 compute-0 systemd[1]: libpod-01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873.scope: Deactivated successfully.
Feb 23 11:24:28 compute-0 podman[218409]: 2026-02-23 11:24:28.467127585 +0000 UTC m=+0.056887715 container died 01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 11:24:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873-userdata-shm.mount: Deactivated successfully.
Feb 23 11:24:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-92b24f56c3fa44a166db8a9438db716d6bc43a8c5239d94dac85254d8ffedbab-merged.mount: Deactivated successfully.
Feb 23 11:24:28 compute-0 podman[218409]: 2026-02-23 11:24:28.505113972 +0000 UTC m=+0.094874082 container cleanup 01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 11:24:28 compute-0 systemd[1]: libpod-conmon-01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873.scope: Deactivated successfully.
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.539 187643 DEBUG nova.virt.libvirt.guest [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.539 187643 INFO nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migration operation has completed
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.539 187643 INFO nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] _post_live_migration() is started..
Feb 23 11:24:28 compute-0 virtqemud[186733]: Cannot recv data: Input/output error
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.545 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.546 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.546 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 23 11:24:28 compute-0 podman[218444]: 2026-02-23 11:24:28.595023893 +0000 UTC m=+0.065093210 container remove 01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.599 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[476a5bb3-3538-41e0-8b10-2fe97f2185da]: (4, ('Mon Feb 23 11:24:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c (01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873)\n01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873\nMon Feb 23 11:24:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c (01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873)\n01fac061968ed4de9b667cf2346c1357b2c636bd52dff2042d5cf4454bd19873\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.601 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[dc334419-1ff1-4e88-a70b-7591f581dc9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.602 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44df174f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.605 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:28 compute-0 kernel: tap44df174f-c0: left promiscuous mode
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.611 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.611 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.614 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc3d78f-f883-4ce7-bf3a-ba63105e2ba6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.640 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[d16b7e38-a28b-4e02-b0d7-48ee17105fe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.641 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[60837227-f095-48dd-a187-90f6be1a7e1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.657 208487 DEBUG oslo.privsep.daemon [-] privsep: reply[f1619134-8ed3-47b3-bd60-23829e605cf2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497318, 'reachable_time': 33268, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218474, 'error': None, 'target': 'ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.661 107369 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44df174f-c509-4e62-8ffe-50240bf4471c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 11:24:28 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:24:28.661 107369 DEBUG oslo.privsep.daemon [-] privsep: reply[db186fd5-9426-40d6-a9a3-3b8b02bbd1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 11:24:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d44df174f\x2dc509\x2d4e62\x2d8ffe\x2d50240bf4471c.mount: Deactivated successfully.
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.891 187643 DEBUG nova.compute.manager [req-6dbdd619-4b85-45fb-b5cd-0e6178a3a360 req-2a293505-e448-4937-af2a-6f9d90b7551b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-unplugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.891 187643 DEBUG oslo_concurrency.lockutils [req-6dbdd619-4b85-45fb-b5cd-0e6178a3a360 req-2a293505-e448-4937-af2a-6f9d90b7551b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.892 187643 DEBUG oslo_concurrency.lockutils [req-6dbdd619-4b85-45fb-b5cd-0e6178a3a360 req-2a293505-e448-4937-af2a-6f9d90b7551b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.892 187643 DEBUG oslo_concurrency.lockutils [req-6dbdd619-4b85-45fb-b5cd-0e6178a3a360 req-2a293505-e448-4937-af2a-6f9d90b7551b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.892 187643 DEBUG nova.compute.manager [req-6dbdd619-4b85-45fb-b5cd-0e6178a3a360 req-2a293505-e448-4937-af2a-6f9d90b7551b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-unplugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:24:28 compute-0 nova_compute[187639]: 2026-02-23 11:24:28.892 187643 DEBUG nova.compute.manager [req-6dbdd619-4b85-45fb-b5cd-0e6178a3a360 req-2a293505-e448-4937-af2a-6f9d90b7551b 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-unplugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.147 187643 DEBUG nova.network.neutron [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Activated binding for port b070b4ba-b956-47f5-be40-6adbf98ac276 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.147 187643 DEBUG nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.148 187643 DEBUG nova.virt.libvirt.vif [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-23T11:23:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-426222971',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-426222971',id=29,image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T11:23:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='12cbe83eae144acf8af44a253e12c69e',ramdisk_id='',reservation_id='r-ynmpvtiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='0ef805b1-b4a6-4839-ade3-d18a6c4b570e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-777871079',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-777871079-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T11:24:17Z,user_data=None,user_id='c83e20651b24489da73abd4b530fb971',uuid=194d7361-7d98-4642-8052-446791060f8a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.148 187643 DEBUG nova.network.os_vif_util [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converting VIF {"id": "b070b4ba-b956-47f5-be40-6adbf98ac276", "address": "fa:16:3e:ee:cf:53", "network": {"id": "44df174f-c509-4e62-8ffe-50240bf4471c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1467119249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12cbe83eae144acf8af44a253e12c69e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb070b4ba-b9", "ovs_interfaceid": "b070b4ba-b956-47f5-be40-6adbf98ac276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.149 187643 DEBUG nova.network.os_vif_util [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.149 187643 DEBUG os_vif [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.151 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.152 187643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb070b4ba-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.153 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.157 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.161 187643 INFO os_vif [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:cf:53,bridge_name='br-int',has_traffic_filtering=True,id=b070b4ba-b956-47f5-be40-6adbf98ac276,network=Network(44df174f-c509-4e62-8ffe-50240bf4471c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb070b4ba-b9')
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.161 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.162 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.162 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.163 187643 DEBUG nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.163 187643 INFO nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Deleting instance files /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a_del
Feb 23 11:24:29 compute-0 nova_compute[187639]: 2026-02-23 11:24:29.164 187643 INFO nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Deletion of /var/lib/nova/instances/194d7361-7d98-4642-8052-446791060f8a_del complete
Feb 23 11:24:29 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 11:24:29 compute-0 podman[197002]: time="2026-02-23T11:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:24:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:24:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 23 11:24:30 compute-0 nova_compute[187639]: 2026-02-23 11:24:30.611 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:30 compute-0 podman[218476]: 2026-02-23 11:24:30.886579334 +0000 UTC m=+0.078056110 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, release=1770267347)
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.011 187643 DEBUG nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.012 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.013 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.013 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.013 187643 DEBUG nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.014 187643 WARNING nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received unexpected event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with vm_state active and task_state migrating.
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.014 187643 DEBUG nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.014 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.015 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.015 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.016 187643 DEBUG nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.016 187643 WARNING nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received unexpected event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with vm_state active and task_state migrating.
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.016 187643 DEBUG nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.017 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.017 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.018 187643 DEBUG oslo_concurrency.lockutils [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.018 187643 DEBUG nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] No waiting events found dispatching network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 11:24:31 compute-0 nova_compute[187639]: 2026-02-23 11:24:31.018 187643 WARNING nova.compute.manager [req-ffdfc22e-f8ff-4a1c-9beb-ddc2a5406cc9 req-45e19d57-840d-4927-af3d-a9853f8057f3 9382556b10444320a838ccb80f376dd8 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Received unexpected event network-vif-plugged-b070b4ba-b956-47f5-be40-6adbf98ac276 for instance with vm_state active and task_state migrating.
Feb 23 11:24:31 compute-0 openstack_network_exporter[199919]: ERROR   11:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:24:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:24:31 compute-0 openstack_network_exporter[199919]: ERROR   11:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:24:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:24:34 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 23 11:24:34 compute-0 systemd[218330]: Activating special unit Exit the Session...
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped target Main User Target.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped target Basic System.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped target Paths.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped target Sockets.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped target Timers.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 11:24:34 compute-0 systemd[218330]: Closed D-Bus User Message Bus Socket.
Feb 23 11:24:34 compute-0 systemd[218330]: Stopped Create User's Volatile Files and Directories.
Feb 23 11:24:34 compute-0 systemd[218330]: Removed slice User Application Slice.
Feb 23 11:24:34 compute-0 systemd[218330]: Reached target Shutdown.
Feb 23 11:24:34 compute-0 systemd[218330]: Finished Exit the Session.
Feb 23 11:24:34 compute-0 systemd[218330]: Reached target Exit the Session.
Feb 23 11:24:34 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 23 11:24:34 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 23 11:24:34 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 23 11:24:34 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 23 11:24:34 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 23 11:24:34 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 23 11:24:34 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.156 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.785 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "194d7361-7d98-4642-8052-446791060f8a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.786 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.786 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "194d7361-7d98-4642-8052-446791060f8a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.804 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.804 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.805 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.805 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.950 187643 WARNING nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.951 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5782MB free_disk=73.20068740844727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.952 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.952 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:24:34 compute-0 nova_compute[187639]: 2026-02-23 11:24:34.996 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration for instance 194d7361-7d98-4642-8052-446791060f8a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.018 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.047 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Migration 6405d905-5a72-40e4-a625-73383c777b00 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.047 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.047 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.091 187643 DEBUG nova.compute.provider_tree [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.105 187643 DEBUG nova.scheduler.client.report [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.126 187643 DEBUG nova.compute.resource_tracker [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.127 187643 DEBUG oslo_concurrency.lockutils [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.132 187643 INFO nova.compute.manager [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.207 187643 INFO nova.scheduler.client.report [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] Deleted allocation for migration 6405d905-5a72-40e4-a625-73383c777b00
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.208 187643 DEBUG nova.virt.libvirt.driver [None req-7d903424-840b-4880-9a6f-d8fb457085fb a34fa0910e7245429f0005c95fd8a714 f7956db4065f43239f51428ae2328f4a - - default default] [instance: 194d7361-7d98-4642-8052-446791060f8a] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 23 11:24:35 compute-0 nova_compute[187639]: 2026-02-23 11:24:35.613 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:36 compute-0 sshd-session[218500]: Invalid user user from 143.198.30.3 port 58788
Feb 23 11:24:36 compute-0 sshd-session[218500]: Connection closed by invalid user user 143.198.30.3 port 58788 [preauth]
Feb 23 11:24:39 compute-0 nova_compute[187639]: 2026-02-23 11:24:39.158 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:39 compute-0 nova_compute[187639]: 2026-02-23 11:24:39.446 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:24:40 compute-0 nova_compute[187639]: 2026-02-23 11:24:40.613 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:41 compute-0 podman[218502]: 2026-02-23 11:24:41.855025276 +0000 UTC m=+0.054737398 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 11:24:43 compute-0 nova_compute[187639]: 2026-02-23 11:24:43.539 187643 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771845868.5382419, 194d7361-7d98-4642-8052-446791060f8a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 11:24:43 compute-0 nova_compute[187639]: 2026-02-23 11:24:43.539 187643 INFO nova.compute.manager [-] [instance: 194d7361-7d98-4642-8052-446791060f8a] VM Stopped (Lifecycle Event)
Feb 23 11:24:43 compute-0 nova_compute[187639]: 2026-02-23 11:24:43.560 187643 DEBUG nova.compute.manager [None req-0f41a781-e761-442a-b509-79d0a0717c08 - - - - - -] [instance: 194d7361-7d98-4642-8052-446791060f8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 11:24:44 compute-0 nova_compute[187639]: 2026-02-23 11:24:44.160 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:45 compute-0 nova_compute[187639]: 2026-02-23 11:24:45.614 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:46 compute-0 podman[218528]: 2026-02-23 11:24:46.870873949 +0000 UTC m=+0.062078761 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 11:24:49 compute-0 nova_compute[187639]: 2026-02-23 11:24:49.162 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:50 compute-0 nova_compute[187639]: 2026-02-23 11:24:50.661 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:54 compute-0 nova_compute[187639]: 2026-02-23 11:24:54.164 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:55 compute-0 nova_compute[187639]: 2026-02-23 11:24:55.662 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:58 compute-0 podman[218548]: 2026-02-23 11:24:58.894805439 +0000 UTC m=+0.093714392 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 11:24:59 compute-0 nova_compute[187639]: 2026-02-23 11:24:59.166 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:24:59 compute-0 podman[197002]: time="2026-02-23T11:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:24:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:24:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 23 11:25:00 compute-0 nova_compute[187639]: 2026-02-23 11:25:00.664 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:01 compute-0 openstack_network_exporter[199919]: ERROR   11:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:25:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:25:01 compute-0 openstack_network_exporter[199919]: ERROR   11:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:25:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:25:01 compute-0 podman[218574]: 2026-02-23 11:25:01.86892464 +0000 UTC m=+0.056732901 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, version=9.7, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1770267347)
Feb 23 11:25:02 compute-0 sshd-session[218595]: Invalid user admin from 165.227.79.48 port 49122
Feb 23 11:25:02 compute-0 sshd-session[218595]: Connection closed by invalid user admin 165.227.79.48 port 49122 [preauth]
Feb 23 11:25:04 compute-0 nova_compute[187639]: 2026-02-23 11:25:04.195 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:05 compute-0 nova_compute[187639]: 2026-02-23 11:25:05.667 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:08 compute-0 ovn_controller[97601]: 2026-02-23T11:25:08Z|00229|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 23 11:25:09 compute-0 nova_compute[187639]: 2026-02-23 11:25:09.197 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:09 compute-0 sshd-session[218597]: Invalid user user from 143.198.30.3 port 44518
Feb 23 11:25:09 compute-0 sshd-session[218597]: Connection closed by invalid user user 143.198.30.3 port 44518 [preauth]
Feb 23 11:25:10 compute-0 nova_compute[187639]: 2026-02-23 11:25:10.705 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:25:12.670 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:25:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:25:12.671 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:25:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:25:12.671 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:25:12 compute-0 podman[218599]: 2026-02-23 11:25:12.858247479 +0000 UTC m=+0.051818752 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:25:13 compute-0 nova_compute[187639]: 2026-02-23 11:25:13.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:14 compute-0 nova_compute[187639]: 2026-02-23 11:25:14.242 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:15 compute-0 nova_compute[187639]: 2026-02-23 11:25:15.706 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:16 compute-0 nova_compute[187639]: 2026-02-23 11:25:16.386 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:16 compute-0 nova_compute[187639]: 2026-02-23 11:25:16.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:17 compute-0 nova_compute[187639]: 2026-02-23 11:25:17.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:17 compute-0 nova_compute[187639]: 2026-02-23 11:25:17.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:25:17 compute-0 nova_compute[187639]: 2026-02-23 11:25:17.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:25:17 compute-0 nova_compute[187639]: 2026-02-23 11:25:17.718 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:25:17 compute-0 podman[218624]: 2026-02-23 11:25:17.880530741 +0000 UTC m=+0.083190475 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 11:25:18 compute-0 nova_compute[187639]: 2026-02-23 11:25:18.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:18 compute-0 nova_compute[187639]: 2026-02-23 11:25:18.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:18 compute-0 nova_compute[187639]: 2026-02-23 11:25:18.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:25:19 compute-0 nova_compute[187639]: 2026-02-23 11:25:19.244 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:20 compute-0 nova_compute[187639]: 2026-02-23 11:25:20.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:20 compute-0 nova_compute[187639]: 2026-02-23 11:25:20.749 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:23 compute-0 nova_compute[187639]: 2026-02-23 11:25:23.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:24 compute-0 nova_compute[187639]: 2026-02-23 11:25:24.246 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:25 compute-0 nova_compute[187639]: 2026-02-23 11:25:25.751 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:25:26.155 106968 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:05:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:03:64:5c:da:fe'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.156 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:26 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:25:26.158 106968 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.686 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.690 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.724 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.725 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.726 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.726 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.934 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.935 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5800MB free_disk=73.20072174072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.936 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:25:26 compute-0 nova_compute[187639]: 2026-02-23 11:25:26.936 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:25:27 compute-0 nova_compute[187639]: 2026-02-23 11:25:27.017 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:25:27 compute-0 nova_compute[187639]: 2026-02-23 11:25:27.018 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:25:27 compute-0 nova_compute[187639]: 2026-02-23 11:25:27.047 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:25:27 compute-0 nova_compute[187639]: 2026-02-23 11:25:27.064 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:25:27 compute-0 nova_compute[187639]: 2026-02-23 11:25:27.066 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:25:27 compute-0 nova_compute[187639]: 2026-02-23 11:25:27.067 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:25:29 compute-0 nova_compute[187639]: 2026-02-23 11:25:29.248 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:29 compute-0 podman[197002]: time="2026-02-23T11:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:25:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:25:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 23 11:25:29 compute-0 podman[218646]: 2026-02-23 11:25:29.882118295 +0000 UTC m=+0.077278680 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 23 11:25:30 compute-0 nova_compute[187639]: 2026-02-23 11:25:30.770 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:31 compute-0 openstack_network_exporter[199919]: ERROR   11:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:25:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:25:31 compute-0 openstack_network_exporter[199919]: ERROR   11:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:25:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:25:32 compute-0 podman[218672]: 2026-02-23 11:25:32.864369979 +0000 UTC m=+0.060419167 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 11:25:34 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:25:34.160 106968 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=260ff7a6-2911-481e-914f-54dc92f9c3bf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 11:25:34 compute-0 nova_compute[187639]: 2026-02-23 11:25:34.249 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:35 compute-0 nova_compute[187639]: 2026-02-23 11:25:35.772 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:39 compute-0 nova_compute[187639]: 2026-02-23 11:25:39.251 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:40 compute-0 nova_compute[187639]: 2026-02-23 11:25:40.774 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:41 compute-0 sshd-session[218693]: Invalid user user from 143.198.30.3 port 51410
Feb 23 11:25:41 compute-0 sshd-session[218693]: Connection closed by invalid user user 143.198.30.3 port 51410 [preauth]
Feb 23 11:25:43 compute-0 podman[218695]: 2026-02-23 11:25:43.86224943 +0000 UTC m=+0.065060369 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 11:25:44 compute-0 nova_compute[187639]: 2026-02-23 11:25:44.253 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:45 compute-0 sshd-session[218721]: Invalid user admin from 165.227.79.48 port 35696
Feb 23 11:25:45 compute-0 sshd-session[218721]: Connection closed by invalid user admin 165.227.79.48 port 35696 [preauth]
Feb 23 11:25:45 compute-0 nova_compute[187639]: 2026-02-23 11:25:45.775 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:48 compute-0 podman[218723]: 2026-02-23 11:25:48.856578259 +0000 UTC m=+0.051682718 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 23 11:25:49 compute-0 nova_compute[187639]: 2026-02-23 11:25:49.255 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:50 compute-0 nova_compute[187639]: 2026-02-23 11:25:50.822 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:52 compute-0 ovn_controller[97601]: 2026-02-23T11:25:52Z|00230|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Feb 23 11:25:54 compute-0 nova_compute[187639]: 2026-02-23 11:25:54.257 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:55 compute-0 nova_compute[187639]: 2026-02-23 11:25:55.825 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:59 compute-0 nova_compute[187639]: 2026-02-23 11:25:59.259 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:25:59 compute-0 podman[197002]: time="2026-02-23T11:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:25:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:25:59 compute-0 podman[197002]: @ - - [23/Feb/2026:11:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 23 11:26:00 compute-0 nova_compute[187639]: 2026-02-23 11:26:00.827 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:00 compute-0 podman[218744]: 2026-02-23 11:26:00.865650039 +0000 UTC m=+0.069601218 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 11:26:01 compute-0 openstack_network_exporter[199919]: ERROR   11:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:26:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:26:01 compute-0 openstack_network_exporter[199919]: ERROR   11:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:26:01 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:26:03 compute-0 podman[218770]: 2026-02-23 11:26:03.86757509 +0000 UTC m=+0.066085226 container health_status ef263360800808ef45daa921bbccf57e229a38d21cba4b33ceeb73d14067af7a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, architecture=x86_64)
Feb 23 11:26:04 compute-0 nova_compute[187639]: 2026-02-23 11:26:04.261 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:05 compute-0 nova_compute[187639]: 2026-02-23 11:26:05.828 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:06 compute-0 sshd-session[218791]: Connection closed by authenticating user root 185.156.73.233 port 34890 [preauth]
Feb 23 11:26:09 compute-0 nova_compute[187639]: 2026-02-23 11:26:09.263 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:10 compute-0 nova_compute[187639]: 2026-02-23 11:26:10.832 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:26:12.671 106968 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:26:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:26:12.672 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:26:12 compute-0 ovn_metadata_agent[106963]: 2026-02-23 11:26:12.672 106968 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:26:14 compute-0 nova_compute[187639]: 2026-02-23 11:26:14.068 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:14 compute-0 sshd-session[218793]: Invalid user user from 143.198.30.3 port 37140
Feb 23 11:26:14 compute-0 sshd-session[218793]: Connection closed by invalid user user 143.198.30.3 port 37140 [preauth]
Feb 23 11:26:14 compute-0 podman[218795]: 2026-02-23 11:26:14.206338017 +0000 UTC m=+0.069103535 container health_status d66eee93f5bd45508bab877627ccf0d5ade5cce8183f94705b202294765f3771 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 11:26:14 compute-0 nova_compute[187639]: 2026-02-23 11:26:14.265 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:15 compute-0 nova_compute[187639]: 2026-02-23 11:26:15.869 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:17 compute-0 nova_compute[187639]: 2026-02-23 11:26:17.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:17 compute-0 nova_compute[187639]: 2026-02-23 11:26:17.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 11:26:17 compute-0 nova_compute[187639]: 2026-02-23 11:26:17.692 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 11:26:17 compute-0 nova_compute[187639]: 2026-02-23 11:26:17.726 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 23 11:26:17 compute-0 nova_compute[187639]: 2026-02-23 11:26:17.727 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:19 compute-0 nova_compute[187639]: 2026-02-23 11:26:19.267 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:19 compute-0 nova_compute[187639]: 2026-02-23 11:26:19.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:19 compute-0 nova_compute[187639]: 2026-02-23 11:26:19.691 187643 DEBUG nova.compute.manager [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 11:26:19 compute-0 podman[218819]: 2026-02-23 11:26:19.864966596 +0000 UTC m=+0.058210729 container health_status 7959d8e386cb1254e43d7204da8e2920504cf9cdf11c5e0f900153b8f31a16a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 11:26:20 compute-0 nova_compute[187639]: 2026-02-23 11:26:20.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:20 compute-0 nova_compute[187639]: 2026-02-23 11:26:20.692 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:20 compute-0 nova_compute[187639]: 2026-02-23 11:26:20.870 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:23 compute-0 sshd-session[218838]: Accepted publickey for zuul from 192.168.122.10 port 37814 ssh2: ECDSA SHA256:Eb6YhWE8gMAIItNYB6fv+se9MmEBZad9zieK/h3FBTg
Feb 23 11:26:23 compute-0 systemd-logind[808]: New session 47 of user zuul.
Feb 23 11:26:23 compute-0 systemd[1]: Started Session 47 of User zuul.
Feb 23 11:26:23 compute-0 sshd-session[218838]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 23 11:26:23 compute-0 sudo[218842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 23 11:26:23 compute-0 sudo[218842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 11:26:23 compute-0 nova_compute[187639]: 2026-02-23 11:26:23.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:24 compute-0 nova_compute[187639]: 2026-02-23 11:26:24.269 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:25 compute-0 nova_compute[187639]: 2026-02-23 11:26:25.904 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:27 compute-0 ovs-vsctl[219011]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.691 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.715 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.715 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.715 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.716 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 11:26:27 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 218866 (sos)
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.827 187643 WARNING nova.virt.libvirt.driver [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.829 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.20006942749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.829 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.829 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 11:26:27 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 23 11:26:27 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.883 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.883 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.897 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing inventories for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.913 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating ProviderTree inventory for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.914 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Updating inventory in ProviderTree for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.927 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing aggregate associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.953 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Refreshing trait associations for resource provider 8ecb3de0-8241-4d60-9a57-9609e064b906, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.970 187643 DEBUG nova.compute.provider_tree [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed in ProviderTree for provider: 8ecb3de0-8241-4d60-9a57-9609e064b906 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.991 187643 DEBUG nova.scheduler.client.report [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Inventory has not changed for provider 8ecb3de0-8241-4d60-9a57-9609e064b906 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.992 187643 DEBUG nova.compute.resource_tracker [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 11:26:27 compute-0 nova_compute[187639]: 2026-02-23 11:26:27.992 187643 DEBUG oslo_concurrency.lockutils [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 11:26:28 compute-0 virtqemud[186733]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 23 11:26:28 compute-0 virtqemud[186733]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 23 11:26:28 compute-0 virtqemud[186733]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 23 11:26:28 compute-0 crontab[219410]: (root) LIST (root)
Feb 23 11:26:28 compute-0 nova_compute[187639]: 2026-02-23 11:26:28.988 187643 DEBUG oslo_service.periodic_task [None req-dcf9ea40-5bc9-4a35-9ee3-598dedef8c1a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 11:26:29 compute-0 nova_compute[187639]: 2026-02-23 11:26:29.271 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:29 compute-0 sshd-session[219477]: Invalid user admin from 165.227.79.48 port 33774
Feb 23 11:26:29 compute-0 sshd-session[219477]: Connection closed by invalid user admin 165.227.79.48 port 33774 [preauth]
Feb 23 11:26:29 compute-0 podman[197002]: time="2026-02-23T11:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 11:26:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 23 11:26:29 compute-0 podman[197002]: @ - - [23/Feb/2026:11:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 23 11:26:30 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 23 11:26:30 compute-0 systemd[1]: Starting Hostname Service...
Feb 23 11:26:30 compute-0 systemd[1]: Started Hostname Service.
Feb 23 11:26:30 compute-0 nova_compute[187639]: 2026-02-23 11:26:30.956 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 11:26:31 compute-0 openstack_network_exporter[199919]: ERROR   11:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 11:26:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:26:31 compute-0 openstack_network_exporter[199919]: ERROR   11:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 11:26:31 compute-0 openstack_network_exporter[199919]: 
Feb 23 11:26:31 compute-0 podman[219659]: 2026-02-23 11:26:31.917409135 +0000 UTC m=+0.114035685 container health_status 13f263d0165e0035fe3c42294d44e9567937bee37331245690f612f5fe976be1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cfb8a2fedbb04ce727863bd4e005e57158a82fd5810c7fe5f7cc1b316a2e7593-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c-b6322698acf71fc8354019ed05314ddbddae1e4e53a7be87a0b7735841afcc0c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 11:26:34 compute-0 nova_compute[187639]: 2026-02-23 11:26:34.274 187643 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
